Python Lab Update

Submitted by:

Piu Mallick | Email: pim16@pitt.edu | ID: 4374215

Debdas Ghosh | Email: deg107@pitt.edu | ID: 4366821

In [1]:
# Getting working directory
import os
import math
import seaborn as sns
sns.set(style='white', rc={'figure.figsize':(10,10)})
os.getcwd()
Out[1]:
'/Users/piumallick/Documents/2ndSemester/INFSCI2160-DataMining/Python Lab Latest'
In [2]:
# Ignore warnings for cleaner notebook output.
# Note: warnings.catch_warnings() only suppresses warnings inside its
# `with` block; filterwarnings("ignore") suppresses them for the whole session.
import warnings
warnings.filterwarnings("ignore")
In [3]:
# Loading necessary libraries
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import skew
pd.set_option("display.max_rows", None)
pd.set_option("display.max_columns", None)
from sklearn import preprocessing
In [4]:
# Loading the dataset
df = pd.read_csv('PYTHON_LAB_TRAIN.csv')
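By default, `read_csv` loads the date columns (PROC_DATE, CREATE_DT_TM, SCHED_START_DT_TM) as plain strings. A minimal sketch of parsing them on read, using a synthetic two-row CSV since PYTHON_LAB_TRAIN.csv is not reproduced here:

```python
import io
import pandas as pd

# Synthetic stand-in for two date columns from the lab file
csv = io.StringIO(
    "PROC_DATE,CREATE_DT_TM,LOS\n"
    "2/19/2018 0:00,2/13/2018 20:04,7.66\n"
    "2/22/2018 0:00,1/23/2018 9:27,2.58\n"
)

# parse_dates converts the named columns to datetime64 during the read
sample = pd.read_csv(csv, parse_dates=["PROC_DATE", "CREATE_DT_TM"])
print(sample.dtypes)
```

Parsed datetimes make it easy to derive features later, e.g. the lead time between CREATE_DT_TM and SCHED_START_DT_TM.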
In [5]:
# Checking sample data
df.head()
Out[5]:
[Output truncated for readability: df.head() prints the first five rows of a very wide table (hundreds of columns), covering scheduling fields (SCHED_SURG_AREA, PROC_DATE, SCHED_HOSPITAL, SCHED_START_DT_TM), demographics (RACE, ETHNIC_GROUP, FEMALE, AGE_ON_CONTACT_DATE), vitals (BMI, WEIGHT, BP_SYSTOLIC, BP_DIASTOLIC, PULSE), binary medication flags, *_HST disease-history indicators, *_CLOSEST lab values, *_MAX_1/*_MIN_1 lab extremes during the prior encounter, ADI_2015, and the target LOS. Many of the lab columns are NaN in these sample rows.]
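Because the table printed by `df.head()` is far too wide to read, a small sketch (on a synthetic frame, with illustrative column names) of ways to inspect a wide dataset without dumping every column:

```python
import numpy as np
import pandas as pd

# Tiny synthetic stand-in for the wide training frame
wide = pd.DataFrame(
    np.random.rand(5, 6),
    columns=["BMI", "PULSE", "HGB_CLOSEST", "WBC_CLOSEST", "ADI_2015", "LOS"],
)

print(wide.shape)                  # (rows, columns) without printing the data
print(wide.dtypes.value_counts())  # how many columns of each dtype
print(wide.iloc[:, :3].head())     # preview only the first few columns
```

Slicing with `iloc[:, :n]` keeps previews legible even when the real frame has hundreds of columns.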
In [6]:
# Checking the statistics of the numerical variables in the dataset
df.describe()
Out[6]:
[Output truncated for readability: df.describe() reports count/mean/std/min/25%/50%/75%/max for every numeric column. Key observations recoverable from the printout: the training set has 80,000 rows, but non-null counts vary enormously across columns (many lab variables have only a few thousand observations, and a few have almost none), signaling heavy missingness; the target LOS has mean ≈ 4.17 and std ≈ 5.89, with a minimum of −226.40, an implausible negative length of stay that will need cleaning.]
67.600000 35.000000 15.400000 13.800000 7.400000 6.000000 1.000000 2.500000 1.900000 5.000000 3.000000 1.700000 1.100000 4.000000 3.400000 1.200000 0.500000 65.000000 25.000000 65.000000 25.000000 147.000000 88.000000 7.900000 6.900000 29.000000 47.700000 15.400000 31.900000 1.370000 0.170000 1.026000 1.019000 5.000000 3.000000 7.000000 6.000000 15.000000 15.000000 2.0 2.0 500.000000 0.000000 1.000000 0.030000 0.000000 0.000000 1.000000 1.000000 31.000000 5.000000 22.000000 5.000000 303.000000 291.000000 0.400000 0.200000 NaN NaN 0.0 0.0 5.000000 2.000000 1.100000 1.000000 52.000000 40.000000 72.639728 4.642361
max 3.693088e+09 1.000000 104.700000 89.860000 22736.000000 249.000000 159.000000 195.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1654.000000 212.000000 520.000000 10618.000000 802.000000 312.800000 52.50000 210.000000 12917.000000 227.800000 91.700000 210.000000 520.000000 8570.000000 960.000000 80.000000 141.000000 99999.000000 23926.000000 113.000000 1448.000000 606.000000 140.000000 181.000000 9.500000 100.000000 6000.000000 12135.000000 75000.000000 7747.000000 143.300000 205.000000 62.200000 12.000000 4.000000 436.000000 83.700000 108.900000 9037.000000 3413.100000 3050.000000 3980.000000 25515.000000 49.800000 5836.000000 12818.000000 94.000000 290.000000 5.000000 188.000000 164.000000 69.000000 273.000000 15.000000 104.000000 201.000000 94.000000 17.500000 11.100000 462.000000 365.000000 2154.000000 468.000000 21.800000 17.800000 66.100000 57.300000 134.000000 115.000000 171.000000 145.000000 170.500000 10.900000 48.000000 38.000000 8.060000 6.760000 162.700000 134.300000 51.500000 47.000000 43.400000 39.800000 57.000000 23.00000 1312.000000 740.000000 281.500000 
46.800000 63.900000 12.300000 215.000000 127.000000 39.700000 34.200000 34.000000 4.300000 99.000000 97.500000 100.000000 79.000000 59.000000 30.000000 42.000000 18.000000 24.700000 6.400000 16.000000 8.000000 17.100000 4.500000 6.000000 4.900000 35.100000 14.800000 4952.000000 707.000000 4952.000000 707.000000 1949.000000 804.000000 11.200000 9.600000 129.000000 198.000000 40.000000 179.000000 194.500000 64.600000 1.102000 1.083000 16.000000 8.000000 9.000000 9.000000 100.000000 80.000000 2.0 2.0 500.000000 500.000000 250.000000 250.000000 4.000000 4.000000 12.000000 8.000000 12949.000000 2808.000000 27309.000000 3257.000000 350.000000 316.000000 26.100000 17.840000 NaN NaN 0.0 0.0 141.000000 93.000000 3.900000 3.100000 130.000000 84.800000 100.000000 409.658333
In [7]:
# Checking various columns of the dataset
df.columns
Out[7]:
Index(['SCHED_SURG_AREA', 'RACE', 'ETHNIC_GROUP', 'PROC_DATE',
       'SCHED_HOSPITAL', 'CREATE_DT_TM', 'SCHED_START_DT_TM',
       'SCHED_SURG_PROC_CD', 'FEMALE', 'AGE_ON_CONTACT_DATE',
       ...
       'BACTERIA_MAX_1', 'BACTERIA_MIN_1', 'EPITHELIAL_CELLS_MAX_1',
       'EPITHELIAL_CELLS_MIN_1', 'AG_RATIO_MAX_1', 'AG_RATIO_MIN_1',
       'PCO2_ARTERIAL_MAX_1', 'PCO2_ARTERIAL_MIN_1', 'ADI_2015', 'LOS'],
      dtype='object', length=292)
In [8]:
# Checking the datatypes of the columns
df.dtypes
Out[8]:
SCHED_SURG_AREA                      object
RACE                                 object
ETHNIC_GROUP                         object
PROC_DATE                            object
SCHED_HOSPITAL                       object
CREATE_DT_TM                         object
SCHED_START_DT_TM                    object
SCHED_SURG_PROC_CD                    int64
FEMALE                              float64
AGE_ON_CONTACT_DATE                 float64
BMI                                 float64
WEIGHT                              float64
BP_SYSTOLIC                         float64
BP_DIASTOLIC                        float64
PULSE                               float64
PCPVISIT                            float64
METFORMIN_FLAG                      float64
OPIOIDS_FLAG                        float64
ALPHA_BLOCKERS                      float64
CENTRAL_ANTAGONISTS                 float64
RENIN                               float64
BETA_BLOCKERS                       float64
ACE_INHIB                           float64
ARB                                 float64
ALDOSTERONE_BLOCKERS                float64
VASODIALATORS                       float64
DIURETICS                           float64
CALCIUM_BLOCKERS                    float64
STATINS                             float64
INSULIN_MEDS                        float64
ASPIRIN                             float64
WARFARIN                            float64
DOACS                               float64
PRETERM_17P                         float64
MEDROL                              float64
PREDNISONE                          float64
INHALED_STEROID_WITH_LABA           float64
INHALED_STEROID_WITHOUT_LABA        float64
INHALED_STEROIDS                    float64
ASTHMA_BIOLOGICS                    float64
SHORT_ACTING_BRONCHO_DIALATORS      float64
TNF_INHIBITORS                      float64
IMMUNOMODULATORS                    float64
AMINOSALICYLATES                    float64
CORTICOSTEROIDS                     float64
ARNI                                float64
ALLOPURINOL                         float64
SEIZURE                             float64
MUSCLERELAXANT                      float64
DIGOXIN                             float64
INOTROPES                           float64
ANTI_ARRHYTHMIC                     float64
ANTIPLATELET                        float64
SULFONYLUREA                        float64
GLP_1_AGONIST                       float64
THIAZOLIDINEDIONE                   float64
SGLT2_INHIBITOR                     float64
DPP4_INHIBITOR                      float64
ALPHA_GLUCOSIDASE_INHIBITOR         float64
AMYLINOMIMETIC                      float64
RAPID_ACTING_INSULIN                float64
SHORT_ACTING_INSULIN                float64
INTERMEDIATE_ACTING_INSULIN         float64
LONG_ACTING_INSULIN                 float64
MINOCYCLINE                         float64
DOXYCYCLINE                         float64
MELATONIN                           float64
METHAZOLAMIDE                       float64
HYDROXYCHLOROQUINE                  float64
ITTC                                float64
DMARDS                              float64
OBESE_HST                           float64
MORBIDOBESE_HST                     float64
PH_HST                              float64
AFIB_HST                            float64
COPD_HST                            float64
CHF_HST                             float64
DIAB_HST                            float64
CAD_HST                             float64
OSTEO_HST                           float64
HTN_HST                             float64
CANCER_HST                          float64
LUNG_CANCER_HST                     float64
OVARIAN_CANCER_HST                  float64
HEAD_NECK_CANCER_HST                float64
BREAST_CANCER_HST                   float64
ASTHMA_HST                          float64
GERD_HST                            float64
FIBROMYALGIA_HST                    float64
DEPRESSION_HST                      float64
PSORIATIC_ARTHRITIS_HST             float64
RHEUM_ARTHRITIS_HST                 float64
LUPUS_HST                           float64
VTVF_HST                            float64
STROKE_HST                          float64
VASCULARDISEASE_HST                 float64
LOWBACKPAIN_HST                     float64
DVT_HST                             float64
PE_HST                              float64
HYPOTHYROIDISM_HST                  float64
ADRENAL_INSUFFICIENCY_HST           float64
INFERTILITY_HST                     float64
CKD_HST                             float64
ESRD_HST                            float64
OBS_SLEEPAPNEA_HST                  float64
CARDIAC_ARREST_HST                  float64
HEMO_STROKE_HST                     float64
MAJOR_BLEED_HST                     float64
MACULAR_DEGEN_HST                   float64
ANXIETY_HST                         float64
HYPERLIPIDEMIA_HST                  float64
HIV_HST                             float64
ALZHEIMER_HST                       float64
COLORECTAL_CANCER_HST               float64
ENDOMETRIAL_CANCER_HST              float64
GLAUCOMA_HST                        float64
HIP_PELVIC_FRACTURE_HST             float64
BENIGN_PROSTATIC_HYPERPLASIA_HST    float64
CIRRHOSIS_HST                       float64
CIRRHOSIS_HST_1                     float64
CHOLESTEROL_CLOSEST                 float64
HDL_CLOSEST                         float64
LDL_CLOSEST                         float64
TRIG_CLOSEST                        float64
WBC_CLOSEST                         float64
HGB_CLOSEST                         float64
URIC_ACID_CLOSEST                   float64
HCO3_CLOSEST                        float64
SODIUM_CLOSEST                      float64
CREATININE_CLOSEST                  float64
EF_CLOSEST                          float64
FEV1_CLOSEST                        float64
EOS_CLOSEST                         float64
NEUTRO_CLOSEST                      float64
MONO_CLOSEST                        float64
BASOPHIL_CLOSEST                    float64
K_CLOSEST                           float64
EGFR_CLOSEST                        float64
TSH_CLOSEST                         float64
T4_CLOSEST                          float64
GLUCOSE_CLOSEST                     float64
HBA1C_CLOSEST                       float64
ESR_CLOSEST                         float64
VITAMIN_D_CLOSEST                   float64
MAGNESIUM_CLOSEST                   float64
FOLICAC_CLOSEST                     float64
VIT_B12_CLOSEST                     float64
BNP_CLOSEST                         float64
PLATELET_CLOSEST                    float64
PA_PRESSURE_CLOSEST                 float64
HEMATOCRIT_CLOSEST                  float64
ALBUMIN_CLOSEST                     float64
PREALBUMIN_CLOSEST                  float64
MR_CLOSEST                          float64
TR_CLOSEST                          float64
MEANPLATELETVOL_CLOSEST             float64
MCH_CLOSEST                         float64
RDW_CLOSEST                         float64
MCV_CLOSEST                         float64
MCHC_CLOSEST                        float64
RBC_CLOSEST                         float64
LYMPHOCYTE_CLOSEST                  float64
CA125_CLOSEST                       float64
BILIRUBIN_CLOSEST                   float64
ALT_CLOSEST                         float64
AST_CLOSEST                         float64
CA_CLOSEST                          float64
PHOSPHORUS_CLOSEST                  float64
URINEPROTEIN_CLOSEST                float64
TOTALPREVIOUSHOSPVISITS             float64
TOTALPREVIOUSEDVISITS               float64
TOTALPREVIOUSPCPVISITS              float64
PREVIOUSSPECIALTYVISIT              float64
PREVIOUSURGENTCAREVISIT             float64
CAV_REC_SEX                          object
CAV_REC_LANG                         object
CAV_REC_AGE                         float64
CAV_REC_IPOP                         object
CAV_REC_PRIORITY_CODE                object
CAV_REC_DISP_CODE                    object
UREA_NITROGEN_MAX_1                 float64
UREA_NITROGEN_MIN_1                 float64
CALCIUM_MAX_1                       float64
CALCIUM_MIN_1                       float64
IRON_MAX_1                          float64
IRON_MIN_1                          float64
GLUCOSE_MAX_1                       float64
GLUCOSE_MIN_1                       float64
HGB_MAX_1                           float64
HGB_MIN_1                           float64
HEMATOCRIT_MAX_1                    float64
HEMATOCRIT_MIN_1                    float64
CHLORIDE_MAX_1                      float64
CHLORIDE_MIN_1                      float64
SODIUM_MAX_1                        float64
SODIUM_MIN_1                        float64
CREATININE_MAX_1                    float64
CREATININE_MIN_1                    float64
CARBON_DIOXIDE_MAX_1                float64
CARBON_DIOXIDE_MIN_1                float64
RBC_MAX_1                           float64
RBC_MIN_1                           float64
MCV_MAX_1                           float64
MCV_MIN_1                           float64
MCH_MAX_1                           float64
MCH_MIN_1                           float64
MCHC_MAX_1                          float64
MCHC_MIN_1                          float64
ANION_GAP_MAX_1                     float64
ANION_GAP_MIN_1                     float64
PLATELETS_MAX_1                     float64
PLATELETS_MIN_1                     float64
WBC_MAX_1                           float64
WBC_MIN_1                           float64
MEAN_PLATELET_VOLUME_MAX_1          float64
MEAN_PLATELET_VOLUME_MIN_1          float64
EGFR_MAX_1                          float64
EGFR_MIN_1                          float64
RDW_MAX_1                           float64
RDW_MIN_1                           float64
BASOPHILS_MAX_1                     float64
BASOPHILS_MIN_1                     float64
NEUTROPHILS_MAX_1                   float64
NEUTROPHILS_MIN_1                   float64
LYMPHOCYTES_MAX_1                   float64
LYMPHOCYTES_MIN_1                   float64
MONOCYTES_MAX_1                     float64
MONOCYTES_MIN_1                     float64
EOSINOPHILS_MAX_1                   float64
EOSINOPHILS_MIN_1                   float64
MAGNESIUM_MAX_1                     float64
MAGNESIUM_MIN_1                     float64
PHOSPHORUS_MAX_1                    float64
PHOSPHORUS_MIN_1                    float64
INR_MAX_1                           float64
INR_MIN_1                           float64
ALBUMIN_MAX_1                       float64
ALBUMIN_MIN_1                       float64
TOTAL_BILIRUBIN_MAX_1               float64
TOTAL_BILIRUBIN_MIN_1               float64
AST_MAX_1                           float64
AST_MIN_1                           float64
ALT_MAX_1                           float64
ALT_MIN_1                           float64
ALKALINE_PHOSPHATASE_MAX_1          float64
ALKALINE_PHOSPHATASE_MIN_1          float64
TOTAL_PROTEIN_MAX_1                 float64
TOTAL_PROTEIN_MIN_1                 float64
BUN_CREATININE_RATIO_MAX_1          float64
ACTIVATED_PTT_MAX_1                 float64
BUN_CREATININE_RATIO_MIN_1          float64
ACTIVATED_PTT_MIN_1                 float64
TROPONIN_I_MAX_1                    float64
TROPONIN_I_MIN_1                    float64
SPECIFIC_GRAVITY_URINE_MAX_1        float64
SPECIFIC_GRAVITY_URINE_MIN_1        float64
PROTEIN_URINE_MAX_1                 float64
PROTEIN_URINE_MIN_1                 float64
PH_URINE_MAX_1                      float64
PH_URINE_MIN_1                      float64
KETONES_URINE_MAX_1                 float64
KETONES_URINE_MIN_1                 float64
URINE_NITRITE_MAX_1                 float64
URINE_NITRITE_MIN_1                 float64
LEUKOCYTE_ESTERASE_MAX_1            float64
LEUKOCYTE_ESTERASE_MIN_1            float64
BLOOD_URINE_MAX_1                   float64
BLOOD_URINE_MIN_1                   float64
BILIRUBIN_URINE_MAX_1               float64
BILIRUBIN_URINE_MIN_1               float64
UROBILINOGEN_URINE_MAX_1            float64
UROBILINOGEN_URINE_MIN_1            float64
WHITE_BLOOD_CELLS_URINE_MAX_1       float64
WHITE_BLOOD_CELLS_URINE_MIN_1       float64
RED_BLOOD_CELLS_URINE_MAX_1         float64
RED_BLOOD_CELLS_URINE_MIN_1         float64
CALCULATED_OSMOLALITY_MAX_1         float64
CALCULATED_OSMOLALITY_MIN_1         float64
DIRECT_BILIRUBIN_MAX_1              float64
DIRECT_BILIRUBIN_MIN_1              float64
LACTATE_BLOOD_MAX_1                 float64
LACTATE_BLOOD_MIN_1                 float64
BACTERIA_MAX_1                      float64
BACTERIA_MIN_1                      float64
EPITHELIAL_CELLS_MAX_1              float64
EPITHELIAL_CELLS_MIN_1              float64
AG_RATIO_MAX_1                      float64
AG_RATIO_MIN_1                      float64
PCO2_ARTERIAL_MAX_1                 float64
PCO2_ARTERIAL_MIN_1                 float64
ADI_2015                            float64
LOS                                 float64
dtype: object
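The `object`-typed columns in the listing above are the ones that will need encoding before modeling. Rather than reading them off by eye, they can be listed programmatically with pandas `select_dtypes` — a minimal sketch on a toy frame standing in for `df`:

```python
import pandas as pd

# Toy stand-in for df — the real dataset has 292 columns
toy = pd.DataFrame({
    'RACE': ['A', 'B'],    # object column, needs encoding
    'BMI': [28.2, 25.6],   # numeric column, left as-is
})

# Split column names by pandas-inferred dtype
cat_cols = toy.select_dtypes(include='object').columns.tolist()
num_cols = toy.select_dtypes(include='number').columns.tolist()
print(cat_cols)  # ['RACE']
print(num_cols)  # ['BMI']
```

On the full dataset this recovers exactly the string-typed columns shown above (`SCHED_SURG_AREA`, `RACE`, `ETHNIC_GROUP`, the date columns, and the `CAV_REC_*` codes).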
In [9]:
# Making a copy of the dataset
train_df = df.copy()
In [10]:
# Importing LabelEncoder (for transforming non-numerical variables to numeric variables)
le = preprocessing.LabelEncoder()
In [11]:
# Label Encoding SCHED_SURG_AREA
train_df['SCHED_SURG_AREA'] = le.fit_transform(train_df['SCHED_SURG_AREA'])
In [12]:
# Label Encoding RACE
train_df['RACE'] = le.fit_transform(train_df['RACE'].apply(str))
In [13]:
# Label Encoding ETHNIC_GROUP
train_df['ETHNIC_GROUP'] = le.fit_transform(train_df['ETHNIC_GROUP'].apply(str))
In [14]:
# Label Encoding SCHED_HOSPITAL
train_df['SCHED_HOSPITAL'] = le.fit_transform(train_df['SCHED_HOSPITAL'].apply(str))
In [15]:
# Label Encoding SCHED_SURG_PROC_CD
train_df['SCHED_SURG_PROC_CD'] = le.fit_transform(train_df['SCHED_SURG_PROC_CD'].apply(str))
In [16]:
# Label Encoding FEMALE
train_df['FEMALE'] = le.fit_transform(train_df['FEMALE'].apply(str))
In [17]:
# Label Encoding CAV_REC_SEX
train_df['CAV_REC_SEX'] = le.fit_transform(train_df['CAV_REC_SEX'].apply(str))
In [18]:
# Label Encoding CAV_REC_LANG
train_df['CAV_REC_LANG'] = le.fit_transform(train_df['CAV_REC_LANG'].apply(str))
In [19]:
# Label Encoding CAV_REC_IPOP
train_df['CAV_REC_IPOP'] = le.fit_transform(train_df['CAV_REC_IPOP'].apply(str))
In [20]:
# Label Encoding CAV_REC_PRIORITY_CODE
train_df['CAV_REC_PRIORITY_CODE'] = le.fit_transform(train_df['CAV_REC_PRIORITY_CODE'].apply(str))
In [21]:
# Label Encoding CAV_REC_DISP_CODE
train_df['CAV_REC_DISP_CODE'] = le.fit_transform(train_df['CAV_REC_DISP_CODE'].apply(str))
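The repeated per-column cells above can equivalently be written as one loop, since `fit_transform` refits the encoder on each column. A minimal sketch on a toy frame (the real notebook applies this to the full column list):

```python
import pandas as pd
from sklearn import preprocessing

# Toy stand-in for train_df
toy = pd.DataFrame({
    'RACE': ['White', 'Black', float('nan')],
    'CAV_REC_IPOP': ['IP', 'OP', 'IP'],
})

le = preprocessing.LabelEncoder()
for col in ['RACE', 'CAV_REC_IPOP']:
    # .apply(str) turns NaN into the string 'nan' so the encoder accepts it,
    # matching the cells above; each column gets its own fit
    toy[col] = le.fit_transform(toy[col].apply(str))

print(toy.dtypes)  # both columns are now integer-coded
```

Note that because the encoder is refit per column, the integer codes are only meaningful within a column, and missing values become their own category (`'nan'`).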
In [22]:
# Creating the binary target: LOS_Binary = 1 if length of stay exceeds 5 days
train_df['LOS_Binary'] = (train_df['LOS'] > 5).astype(int)
In [23]:
# Checking sample values (after Label Encoding)
train_df.head()
Out[23]:
[train_df.head() output: first five rows across all columns, with the categorical columns now integer-coded and the new LOS_Binary column appended (e.g., LOS 7.66 → 1, LOS 2.58 → 0) — wide table omitted]
In [24]:
# Splitting the data into training (80%) and held-out test (20%) sets;
# the targets and raw date columns are dropped from the feature matrix
from sklearn.model_selection import train_test_split
x_train, x_test, y_train, y_test = train_test_split(train_df.drop(columns = ['LOS', 'LOS_Binary', 'PROC_DATE', 'CREATE_DT_TM', 'SCHED_START_DT_TM']), 
                                                    train_df['LOS_Binary'], 
                                                    test_size=0.2, 
                                                    random_state=1)
# Splitting the training portion again into training and validation sets
x_train, x_val, y_train, y_val = train_test_split(x_train, y_train, test_size = 0.2, random_state = 1)
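As a side note, two successive 80/20 splits leave 64% of the rows for training, 16% for validation, and 20% for the held-out test set. A minimal sketch on synthetic data (the 100 dummy rows are illustrative, not from the lab dataset):

```python
from sklearn.model_selection import train_test_split

# 100 dummy rows stand in for the feature matrix
rows = list(range(100))
labels = [i % 2 for i in range(100)]

# First split: 80% train+validation, 20% test
tr, te, ytr, yte = train_test_split(rows, labels, test_size=0.2, random_state=1)
# Second split: 20% of the remaining 80% becomes validation
tr, va, ytr, yva = train_test_split(tr, ytr, test_size=0.2, random_state=1)

print(len(tr), len(va), len(te))  # 64 16 20
```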

Choosing the first model:

1. CatBoost Model

In [25]:
# The same set of categorical columns is cast to string in the train,
# validation, and test frames, so define the list once and loop
cat_cols = ['SCHED_SURG_AREA', 'RACE', 'ETHNIC_GROUP', 'SCHED_HOSPITAL', 'SCHED_SURG_PROC_CD', 'FEMALE',
            'PCPVISIT', 'METFORMIN_FLAG', 'OPIOIDS_FLAG', 'ALPHA_BLOCKERS', 'CENTRAL_ANTAGONISTS', 'RENIN',
            'BETA_BLOCKERS', 'ACE_INHIB', 'ARB', 'ALDOSTERONE_BLOCKERS', 'VASODIALATORS', 'DIURETICS',
            'CALCIUM_BLOCKERS', 'STATINS', 'INSULIN_MEDS', 'ASPIRIN', 'WARFARIN', 'DOACS', 'PRETERM_17P',
            'MEDROL', 'PREDNISONE', 'INHALED_STEROID_WITH_LABA', 'INHALED_STEROID_WITHOUT_LABA',
            'INHALED_STEROIDS', 'ASTHMA_BIOLOGICS', 'SHORT_ACTING_BRONCHO_DIALATORS', 'TNF_INHIBITORS',
            'IMMUNOMODULATORS', 'AMINOSALICYLATES', 'CORTICOSTEROIDS', 'ARNI', 'ALLOPURINOL', 'SEIZURE',
            'MUSCLERELAXANT', 'DIGOXIN', 'INOTROPES', 'ANTI_ARRHYTHMIC', 'ANTIPLATELET', 'SULFONYLUREA',
            'GLP_1_AGONIST', 'THIAZOLIDINEDIONE', 'SGLT2_INHIBITOR', 'DPP4_INHIBITOR',
            'ALPHA_GLUCOSIDASE_INHIBITOR', 'AMYLINOMIMETIC', 'RAPID_ACTING_INSULIN', 'SHORT_ACTING_INSULIN',
            'INTERMEDIATE_ACTING_INSULIN', 'LONG_ACTING_INSULIN', 'MINOCYCLINE', 'DOXYCYCLINE', 'MELATONIN',
            'METHAZOLAMIDE', 'HYDROXYCHLOROQUINE', 'ITTC', 'DMARDS', 'OBESE_HST', 'MORBIDOBESE_HST',
            'PH_HST', 'AFIB_HST', 'COPD_HST', 'CHF_HST', 'DIAB_HST', 'CAD_HST', 'OSTEO_HST', 'HTN_HST',
            'CANCER_HST', 'LUNG_CANCER_HST', 'OVARIAN_CANCER_HST', 'HEAD_NECK_CANCER_HST',
            'BREAST_CANCER_HST', 'ASTHMA_HST', 'GERD_HST', 'FIBROMYALGIA_HST', 'DEPRESSION_HST',
            'PSORIATIC_ARTHRITIS_HST', 'RHEUM_ARTHRITIS_HST', 'LUPUS_HST', 'VTVF_HST', 'STROKE_HST',
            'VASCULARDISEASE_HST', 'LOWBACKPAIN_HST', 'DVT_HST', 'PE_HST', 'HYPOTHYROIDISM_HST',
            'ADRENAL_INSUFFICIENCY_HST', 'INFERTILITY_HST', 'CKD_HST', 'ESRD_HST', 'OBS_SLEEPAPNEA_HST',
            'CARDIAC_ARREST_HST', 'HEMO_STROKE_HST', 'MAJOR_BLEED_HST', 'MACULAR_DEGEN_HST', 'ANXIETY_HST',
            'HYPERLIPIDEMIA_HST', 'HIV_HST', 'ALZHEIMER_HST', 'COLORECTAL_CANCER_HST',
            'ENDOMETRIAL_CANCER_HST', 'GLAUCOMA_HST', 'HIP_PELVIC_FRACTURE_HST',
            'BENIGN_PROSTATIC_HYPERPLASIA_HST', 'CIRRHOSIS_HST', 'CIRRHOSIS_HST_1', 'CAV_REC_SEX',
            'CAV_REC_LANG', 'CAV_REC_IPOP', 'CAV_REC_PRIORITY_CODE', 'CAV_REC_DISP_CODE']

x_train_cat = x_train.copy()
x_val_cat = x_val.copy()
x_test_cat = x_test.copy()
for frame in (x_train_cat, x_val_cat, x_test_cat):
    frame[cat_cols] = frame[cat_cols].astype(str)
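One caveat worth noting about the `astype(str)` cast above: missing values become the literal string `'nan'`, so CatBoost will see them as an ordinary category rather than as missing. A small pandas demonstration (the column name is hypothetical):

```python
import numpy as np
import pandas as pd

# A binary flag column with a missing entry, as in the lab data
flags = pd.Series([1.0, 0.0, np.nan], name='EXAMPLE_FLAG')
as_str = flags.astype(str)
print(as_str.tolist())  # ['1.0', '0.0', 'nan']
```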
In [26]:
# Checking sample data
x_train.head(20)
Out[26]:
[Out[26] truncated: first rows of x_train, a wide frame (~270 columns) covering demographics and vitals, medication flags, disease-history flags, closest lab values, prior-visit counts, pre-op lab min/max values, and ADI_2015.]
63808 42 16 2 17 198 1 74.1 24.89 2320.0 118.0 82.0 87.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 1.0 0.0 1.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 1.0 1.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 136.0 61.0 53.0 108.0 6.7 14.3 NaN 24.0 140.0 1.14 45.0 NaN 1.0 58.0 8.0 1.0 4.1 47.0 1.370 NaN 97.0 5.8 NaN 45.00 NaN NaN 940.0 NaN 188.0 37.0 43.1 4.3 NaN 3.0 3.0 7.4 32.0 13.6 96.4 33.2 4.47 32.0 NaN 0.6 33.0 38.0 9.6 NaN 1.0 NaN 1.0 5.0 3.0 NaN 0 13 74.0 1 3 28 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 64.393686
3116 39 16 2 16 1347 0 65.6 32.86 3664.0 117.0 78.0 77.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 1.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 1.0 1.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 181.0 30.0 115.0 180.0 12.4 13.2 5.0 25.0 135.0 1.00 50.0 NaN NaN NaN NaN NaN 4.2 60.0 2.149 1.35 158.0 8.4 NaN NaN 2.1 NaN NaN NaN 271.0 NaN 38.6 2.6 NaN NaN NaN 10.1 31.2 16.4 91.0 34.3 4.24 NaN NaN 12.1 63.0 158.0 9.7 3.5 NaN NaN NaN 5.0 1.0 NaN 1 13 65.0 2 3 34 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 74.978456
77992 21 16 2 8 1015 0 34.5 31.14 3776.0 165.0 103.0 98.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 NaN NaN NaN NaN 9.7 13.8 NaN 26.0 140.0 0.92 NaN NaN 2.5 50.1 7.7 1.3 4.6 59.0 NaN NaN 110.0 NaN NaN NaN 2.0 NaN NaN NaN 233.0 NaN 40.7 4.0 NaN NaN NaN 8.6 31.9 13.0 93.9 33.9 4.33 38.4 NaN 0.6 25.0 10.0 8.1 3.2 NaN 2.0 6.0 NaN NaN NaN 1 13 36.0 4 3 30 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 62.224959
66472 26 16 2 11 1095 0 77.1 26.50 2707.0 122.0 78.0 64.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 136.0 45.0 69.0 109.0 6.1 14.6 NaN 28.0 140.0 1.18 NaN NaN 1.0 67.0 10.0 1.0 4.9 59.0 3.360 NaN 88.0 NaN NaN NaN NaN NaN NaN NaN 165.0 NaN 45.4 4.1 NaN NaN NaN 10.3 29.2 14.1 90.9 32.2 5.00 22.0 NaN 0.5 12.0 19.0 9.2 NaN NaN NaN NaN NaN 4.0 NaN 2 54 NaN 5 19 58 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 79.080879
10105 21 17 4 8 1046 2 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 1 13 25.0 1 3 30 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 62.224959
In [27]:
# Create index for categorical variables
predictors = x_train_cat
categorical_var = np.where(predictors.dtypes != np.float64)[0]  # np.float is deprecated; compare against np.float64
print('\nCategorical Variables indices : ',categorical_var)
Categorical Variables indices :  [  0   1   2   3   4   5  12  13  14  15  16  17  18  19  20  21  22  23
  24  25  26  27  28  29  30  31  32  33  34  35  36  37  38  39  40  41
  42  43  44  45  46  47  48  49  50  51  52  53  54  55  56  57  58  59
  60  61  62  63  64  65  66  67  68  69  70  71  72  73  74  75  76  77
  78  79  80  81  82  83  84  85  86  87  88  89  90  91  92  93  94  95
  96  97  98  99 100 101 102 103 104 105 106 107 108 109 110 111 112 113
 114 115 116 171 172 174 175 176]
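As a toy sketch of what the `np.where(...)` call above does, non-float columns can be picked out by position from a list of dtype strings (the dtypes below are made up for illustration):

```python
# Toy sketch: indices of non-float columns, mirroring
# np.where(predictors.dtypes != np.float64)[0] above.
dtypes = ["int64", "object", "float64", "object", "float64"]
categorical_idx = [i for i, t in enumerate(dtypes) if t != "float64"]
print(categorical_idx)  # [0, 1, 3]
```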
In [28]:
# Importing CatBoostClassifier, Pool, cv from CatBoost
from catboost import CatBoostClassifier, Pool, cv
In [29]:
# Setting parameters for CatBoostClassifier with the Logloss objective
cat_boost_model = CatBoostClassifier(
    loss_function='Logloss',
    random_seed=42,
    iterations=10,
    learning_rate=0.03,
    early_stopping_rounds=10,
    # l2_leaf_reg = ???
    depth=3
)
In [30]:
# Fitting CatBoost model
cat_boost_model.fit(
    x_train_cat, y_train,
    cat_features=categorical_var,
    eval_set=(x_val_cat, y_val),
    plot=True
)
0:	learn: 0.6775012	test: 0.6776091	best: 0.6776091 (0)	total: 280ms	remaining: 2.52s
1:	learn: 0.6631555	test: 0.6634204	best: 0.6634204 (1)	total: 424ms	remaining: 1.69s
2:	learn: 0.6493453	test: 0.6498030	best: 0.6498030 (2)	total: 665ms	remaining: 1.55s
3:	learn: 0.6369213	test: 0.6374466	best: 0.6374466 (3)	total: 825ms	remaining: 1.24s
4:	learn: 0.6256758	test: 0.6264573	best: 0.6264573 (4)	total: 1.12s	remaining: 1.12s
5:	learn: 0.6137253	test: 0.6147936	best: 0.6147936 (5)	total: 1.23s	remaining: 822ms
6:	learn: 0.6028308	test: 0.6038549	best: 0.6038549 (6)	total: 1.42s	remaining: 607ms
7:	learn: 0.5933029	test: 0.5945219	best: 0.5945219 (7)	total: 1.55s	remaining: 389ms
8:	learn: 0.5842418	test: 0.5857013	best: 0.5857013 (8)	total: 1.7s	remaining: 188ms
9:	learn: 0.5761256	test: 0.5778106	best: 0.5778106 (9)	total: 1.81s	remaining: 0us

bestTest = 0.5778106395
bestIteration = 9

Out[30]:
<catboost.core.CatBoostClassifier at 0x124bcd518>
In [31]:
# CatBoost Probabilities
catboost_probs_train = cat_boost_model.predict_proba(x_train_cat)
catboost_probs = cat_boost_model.predict_proba(x_test_cat)
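`predict_proba` returns one column per class, so column 1 (named `cat1` after the `add_prefix` step in the next cell) holds P(y = 1), which is the score the ROC curve needs. A minimal sketch with made-up probabilities:

```python
# Each row of predict_proba is [P(y=0), P(y=1)]; keep column 1.
# The probabilities below are made up for illustration.
probs = [[0.9, 0.1], [0.3, 0.7], [0.6, 0.4]]
p_positive = [row[1] for row in probs]
print(p_positive)  # [0.1, 0.7, 0.4]
```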
In [32]:
import sklearn.metrics as metrics
In [33]:
catboost_probs_df_train = pd.DataFrame(catboost_probs_train)
catboost_probs_df_train = catboost_probs_df_train.add_prefix('cat')

catboost_probs_df = pd.DataFrame(catboost_probs)
catboost_probs_df = catboost_probs_df.add_prefix('cat')
fprc, tprc, thresholds = metrics.roc_curve(y_train, catboost_probs_df_train['cat1'])
metrics.auc(fprc, tprc)
Out[33]:
0.7645816237012635
In [34]:
print('CatBoost Train AUC:', metrics.auc(fprc, tprc))
CatBoost Train AUC: 0.7645816237012635
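For intuition, `metrics.auc(fprc, tprc)` computes the area under the ROC curve, which equals the probability that a randomly chosen positive example scores higher than a randomly chosen negative one. A minimal pure-Python sketch with toy labels and scores:

```python
# Rank-statistic view of ROC AUC: fraction of (positive, negative)
# pairs where the positive example gets the higher score (ties count
# as half). Labels/scores below are toy values.
def roc_auc(labels, scores):
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

print(roc_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```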
In [35]:
fprc, tprc, thresholds = metrics.roc_curve(y_test, catboost_probs_df['cat1'])
metrics.auc(fprc, tprc)
Out[35]:
0.7586551484312447
In [36]:
print('CatBoost Test AUC:', metrics.auc(fprc, tprc))
CatBoost Test AUC: 0.7586551484312447

So, the CatBoost Train AUC is 0.7645816237012635 and the Test AUC is 0.7586551484312447.

The difference between the Train and Test AUC is approximately 0.0059, so the model fits well without noticeable overfitting.

However, let's check what results we obtain with the other two gradient-boosted decision tree libraries: XGBoost and LightGBM.

Choosing the 2nd model:

2. XGBoost Model

In [37]:
# Importing XGBoost
import xgboost as xgb
In [38]:
# Checking sample data from x_train
x_train.head()
Out[38]:
SCHED_SURG_AREA RACE ETHNIC_GROUP SCHED_HOSPITAL SCHED_SURG_PROC_CD FEMALE AGE_ON_CONTACT_DATE BMI WEIGHT BP_SYSTOLIC BP_DIASTOLIC PULSE PCPVISIT METFORMIN_FLAG OPIOIDS_FLAG ALPHA_BLOCKERS CENTRAL_ANTAGONISTS RENIN BETA_BLOCKERS ACE_INHIB ARB ALDOSTERONE_BLOCKERS VASODIALATORS DIURETICS CALCIUM_BLOCKERS STATINS INSULIN_MEDS ASPIRIN WARFARIN DOACS PRETERM_17P MEDROL PREDNISONE INHALED_STEROID_WITH_LABA INHALED_STEROID_WITHOUT_LABA INHALED_STEROIDS ASTHMA_BIOLOGICS SHORT_ACTING_BRONCHO_DIALATORS TNF_INHIBITORS IMMUNOMODULATORS AMINOSALICYLATES CORTICOSTEROIDS ARNI ALLOPURINOL SEIZURE MUSCLERELAXANT DIGOXIN INOTROPES ANTI_ARRHYTHMIC ANTIPLATELET SULFONYLUREA GLP_1_AGONIST THIAZOLIDINEDIONE SGLT2_INHIBITOR DPP4_INHIBITOR ALPHA_GLUCOSIDASE_INHIBITOR AMYLINOMIMETIC RAPID_ACTING_INSULIN SHORT_ACTING_INSULIN INTERMEDIATE_ACTING_INSULIN LONG_ACTING_INSULIN MINOCYCLINE DOXYCYCLINE MELATONIN METHAZOLAMIDE HYDROXYCHLOROQUINE ITTC DMARDS OBESE_HST MORBIDOBESE_HST PH_HST AFIB_HST COPD_HST CHF_HST DIAB_HST CAD_HST OSTEO_HST HTN_HST CANCER_HST LUNG_CANCER_HST OVARIAN_CANCER_HST HEAD_NECK_CANCER_HST BREAST_CANCER_HST ASTHMA_HST GERD_HST FIBROMYALGIA_HST DEPRESSION_HST PSORIATIC_ARTHRITIS_HST RHEUM_ARTHRITIS_HST LUPUS_HST VTVF_HST STROKE_HST VASCULARDISEASE_HST LOWBACKPAIN_HST DVT_HST PE_HST HYPOTHYROIDISM_HST ADRENAL_INSUFFICIENCY_HST INFERTILITY_HST CKD_HST ESRD_HST OBS_SLEEPAPNEA_HST CARDIAC_ARREST_HST HEMO_STROKE_HST MAJOR_BLEED_HST MACULAR_DEGEN_HST ANXIETY_HST HYPERLIPIDEMIA_HST HIV_HST ALZHEIMER_HST COLORECTAL_CANCER_HST ENDOMETRIAL_CANCER_HST GLAUCOMA_HST HIP_PELVIC_FRACTURE_HST BENIGN_PROSTATIC_HYPERPLASIA_HST CIRRHOSIS_HST CIRRHOSIS_HST_1 CHOLESTEROL_CLOSEST HDL_CLOSEST LDL_CLOSEST TRIG_CLOSEST WBC_CLOSEST HGB_CLOSEST URIC_ACID_CLOSEST HCO3_CLOSEST SODIUM_CLOSEST CREATININE_CLOSEST EF_CLOSEST FEV1_CLOSEST EOS_CLOSEST NEUTRO_CLOSEST MONO_CLOSEST BASOPHIL_CLOSEST K_CLOSEST EGFR_CLOSEST TSH_CLOSEST T4_CLOSEST GLUCOSE_CLOSEST HBA1C_CLOSEST ESR_CLOSEST VITAMIN_D_CLOSEST 
MAGNESIUM_CLOSEST FOLICAC_CLOSEST VIT_B12_CLOSEST BNP_CLOSEST PLATELET_CLOSEST PA_PRESSURE_CLOSEST HEMATOCRIT_CLOSEST ALBUMIN_CLOSEST PREALBUMIN_CLOSEST MR_CLOSEST TR_CLOSEST MEANPLATELETVOL_CLOSEST MCH_CLOSEST RDW_CLOSEST MCV_CLOSEST MCHC_CLOSEST RBC_CLOSEST LYMPHOCYTE_CLOSEST CA125_CLOSEST BILIRUBIN_CLOSEST ALT_CLOSEST AST_CLOSEST CA_CLOSEST PHOSPHORUS_CLOSEST URINEPROTEIN_CLOSEST TOTALPREVIOUSHOSPVISITS TOTALPREVIOUSEDVISITS TOTALPREVIOUSPCPVISITS PREVIOUSSPECIALTYVISIT PREVIOUSURGENTCAREVISIT CAV_REC_SEX CAV_REC_LANG CAV_REC_AGE CAV_REC_IPOP CAV_REC_PRIORITY_CODE CAV_REC_DISP_CODE UREA_NITROGEN_MAX_1 UREA_NITROGEN_MIN_1 CALCIUM_MAX_1 CALCIUM_MIN_1 IRON_MAX_1 IRON_MIN_1 GLUCOSE_MAX_1 GLUCOSE_MIN_1 HGB_MAX_1 HGB_MIN_1 HEMATOCRIT_MAX_1 HEMATOCRIT_MIN_1 CHLORIDE_MAX_1 CHLORIDE_MIN_1 SODIUM_MAX_1 SODIUM_MIN_1 CREATININE_MAX_1 CREATININE_MIN_1 CARBON_DIOXIDE_MAX_1 CARBON_DIOXIDE_MIN_1 RBC_MAX_1 RBC_MIN_1 MCV_MAX_1 MCV_MIN_1 MCH_MAX_1 MCH_MIN_1 MCHC_MAX_1 MCHC_MIN_1 ANION_GAP_MAX_1 ANION_GAP_MIN_1 PLATELETS_MAX_1 PLATELETS_MIN_1 WBC_MAX_1 WBC_MIN_1 MEAN_PLATELET_VOLUME_MAX_1 MEAN_PLATELET_VOLUME_MIN_1 EGFR_MAX_1 EGFR_MIN_1 RDW_MAX_1 RDW_MIN_1 BASOPHILS_MAX_1 BASOPHILS_MIN_1 NEUTROPHILS_MAX_1 NEUTROPHILS_MIN_1 LYMPHOCYTES_MAX_1 LYMPHOCYTES_MIN_1 MONOCYTES_MAX_1 MONOCYTES_MIN_1 EOSINOPHILS_MAX_1 EOSINOPHILS_MIN_1 MAGNESIUM_MAX_1 MAGNESIUM_MIN_1 PHOSPHORUS_MAX_1 PHOSPHORUS_MIN_1 INR_MAX_1 INR_MIN_1 ALBUMIN_MAX_1 ALBUMIN_MIN_1 TOTAL_BILIRUBIN_MAX_1 TOTAL_BILIRUBIN_MIN_1 AST_MAX_1 AST_MIN_1 ALT_MAX_1 ALT_MIN_1 ALKALINE_PHOSPHATASE_MAX_1 ALKALINE_PHOSPHATASE_MIN_1 TOTAL_PROTEIN_MAX_1 TOTAL_PROTEIN_MIN_1 BUN_CREATININE_RATIO_MAX_1 ACTIVATED_PTT_MAX_1 BUN_CREATININE_RATIO_MIN_1 ACTIVATED_PTT_MIN_1 TROPONIN_I_MAX_1 TROPONIN_I_MIN_1 SPECIFIC_GRAVITY_URINE_MAX_1 SPECIFIC_GRAVITY_URINE_MIN_1 PROTEIN_URINE_MAX_1 PROTEIN_URINE_MIN_1 PH_URINE_MAX_1 PH_URINE_MIN_1 KETONES_URINE_MAX_1 KETONES_URINE_MIN_1 URINE_NITRITE_MAX_1 URINE_NITRITE_MIN_1 LEUKOCYTE_ESTERASE_MAX_1 
LEUKOCYTE_ESTERASE_MIN_1 BLOOD_URINE_MAX_1 BLOOD_URINE_MIN_1 BILIRUBIN_URINE_MAX_1 BILIRUBIN_URINE_MIN_1 UROBILINOGEN_URINE_MAX_1 UROBILINOGEN_URINE_MIN_1 WHITE_BLOOD_CELLS_URINE_MAX_1 WHITE_BLOOD_CELLS_URINE_MIN_1 RED_BLOOD_CELLS_URINE_MAX_1 RED_BLOOD_CELLS_URINE_MIN_1 CALCULATED_OSMOLALITY_MAX_1 CALCULATED_OSMOLALITY_MIN_1 DIRECT_BILIRUBIN_MAX_1 DIRECT_BILIRUBIN_MIN_1 LACTATE_BLOOD_MAX_1 LACTATE_BLOOD_MIN_1 BACTERIA_MAX_1 BACTERIA_MIN_1 EPITHELIAL_CELLS_MAX_1 EPITHELIAL_CELLS_MIN_1 AG_RATIO_MAX_1 AG_RATIO_MIN_1 PCO2_ARTERIAL_MAX_1 PCO2_ARTERIAL_MIN_1 ADI_2015
36116 19 2 2 8 1404 1 63.1 16.47 1488.0 158.0 90.0 115.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 1.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 NaN NaN NaN NaN 2.7 9.4 NaN 26.1 137.0 0.70 NaN NaN NaN NaN NaN NaN 4.3 60.0 NaN NaN 109.0 NaN NaN NaN 2.0 NaN NaN NaN 289.0 NaN 30.9 4.0 NaN NaN NaN 8.7 20.4 25.5 67.0 30.4 4.61 NaN NaN 0.4 32.0 56.0 10.9 3.7 1.0 1.0 NaN NaN 2.0 NaN 0 13 63.0 3 3 14 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 79.080879
13231 29 17 4 12 1015 2 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 1.0 NaN NaN NaN 1 13 75.0 1 3 30 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 80.571579
56836 29 16 2 12 825 1 74.7 31.01 3360.0 110.0 54.0 99.0 1.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 1.0 1.0 0.0 0.0 0.0 0.0 0.0 1.0 1.0 1.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 1.0 1.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 1.0 0.0 0.0 1.0 1.0 0.0 0.0 0.0 1.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 110.0 24.0 45.0 203.0 4.6 10.0 NaN 25.0 138.0 2.01 60.0 NaN 2.5 65.1 8.2 0.4 4.7 24.0 2.08 NaN 177.0 8.2 NaN 18.0 2.0 20.0 1710.0 NaN 84.0 37.0 30.1 3.4 5.96 2.0 1.0 9.1 30.2 15.8 90.9 33.3 3.31 23.8 NaN 0.4 21.0 37.0 9.1 2.7 3.0 4.0 5.0 19.0 23.0 NaN 0 13 74.0 1 3 30 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 80.571579
61971 35 17 4 14 673 2 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 1 13 43.0 2 15 15 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 50.716464
1532 19 2 2 8 1336 0 75.3 37.84 4160.0 135.0 90.0 86.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 1.0 0.0 1.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 1.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0 0.0 0.0 NaN NaN NaN NaN 6.2 11.9 NaN 29.0 145.0 1.21 NaN NaN 3.9 66.4 10.4 0.2 3.9 58.0 NaN NaN 92.0 NaN NaN NaN NaN NaN NaN NaN 146.0 NaN 36.9 3.1 NaN NaN NaN 8.5 28.0 13.4 84.5 33.1 4.37 19.1 NaN 0.7 20.0 18.0 8.6 NaN NaN NaN NaN NaN 6.0 NaN 1 13 74.0 4 3 30 NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN 62.224959
In [39]:
dtrain = xgb.DMatrix(data = x_train, label = y_train)
dval = xgb.DMatrix(data = x_val, label = y_val)
dtest = xgb.DMatrix(data = x_test, label = y_test)
In [40]:
# Manual tuning of hyperparameters.
# Note: 'eta' and 'learning_rate' are aliases (only one takes effect),
# 'silent' is deprecated in favor of 'verbosity', and 'n_estimators'/
# 'maximize' are not xgb.train parameters -- the number of boosting
# rounds is passed separately below.
param = {'max_depth':5,
         'eta': 0.30,
         'silent':1,
         'objective':'binary:logistic',
         'eval_metric': 'auc'
         ,'scale_pos_weight' : 1
         ,'maximize' : 'TRUE'
         ,'n_jobs' : -1,
         'learning_rate':0.01,
         'n_estimators':100,
         'min_child_weight':1,
         'gamma':0,
         'subsample':0.8,
         'colsample_bytree':0.8,
         'nthread':4,
         'seed':27
        }
In [41]:
# Specify validation set to watch performance
watchlist = [(dtrain, 'train'), (dval, 'eval')]
num_round = 100 # This is another hyperparameter of sorts
bst = xgb.train(param, dtrain, num_round, watchlist, early_stopping_rounds = 10)
[0]	train-auc:0.776129	eval-auc:0.763516
Multiple eval metrics have been passed: 'eval-auc' will be used for early stopping.

Will train until eval-auc hasn't improved in 10 rounds.
[1]	train-auc:0.789714	eval-auc:0.774393
[2]	train-auc:0.792951	eval-auc:0.777735
[3]	train-auc:0.795481	eval-auc:0.780521
[4]	train-auc:0.796489	eval-auc:0.782233
[5]	train-auc:0.796962	eval-auc:0.783268
[6]	train-auc:0.797464	eval-auc:0.783422
[7]	train-auc:0.798131	eval-auc:0.783883
[8]	train-auc:0.803143	eval-auc:0.789052
[9]	train-auc:0.80542	eval-auc:0.790752
[10]	train-auc:0.805598	eval-auc:0.790337
[11]	train-auc:0.805459	eval-auc:0.789826
[12]	train-auc:0.805373	eval-auc:0.789763
[13]	train-auc:0.805906	eval-auc:0.790582
[14]	train-auc:0.806623	eval-auc:0.791464
[15]	train-auc:0.806462	eval-auc:0.791361
[16]	train-auc:0.807063	eval-auc:0.791942
[17]	train-auc:0.807349	eval-auc:0.791892
[18]	train-auc:0.8072	eval-auc:0.79153
[19]	train-auc:0.807311	eval-auc:0.791737
[20]	train-auc:0.807328	eval-auc:0.791827
[21]	train-auc:0.807482	eval-auc:0.792242
[22]	train-auc:0.807977	eval-auc:0.79253
[23]	train-auc:0.807762	eval-auc:0.792232
[24]	train-auc:0.807749	eval-auc:0.792105
[25]	train-auc:0.807771	eval-auc:0.7922
[26]	train-auc:0.808576	eval-auc:0.793082
[27]	train-auc:0.80872	eval-auc:0.793098
[28]	train-auc:0.808838	eval-auc:0.793356
[29]	train-auc:0.809322	eval-auc:0.793855
[30]	train-auc:0.809186	eval-auc:0.793565
[31]	train-auc:0.809081	eval-auc:0.793465
[32]	train-auc:0.809195	eval-auc:0.793632
[33]	train-auc:0.809146	eval-auc:0.793597
[34]	train-auc:0.809175	eval-auc:0.793566
[35]	train-auc:0.809113	eval-auc:0.793395
[36]	train-auc:0.809338	eval-auc:0.793288
[37]	train-auc:0.809386	eval-auc:0.793291
[38]	train-auc:0.809281	eval-auc:0.793159
[39]	train-auc:0.80917	eval-auc:0.792939
Stopping. Best iteration:
[29]	train-auc:0.809322	eval-auc:0.793855
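The stopping rule applied above can be sketched as: track the best eval score seen so far and stop once it hasn't improved for `early_stopping_rounds` rounds, keeping the best round. The scores below are made up for illustration:

```python
# Toy sketch of early stopping: stop after `patience` rounds with no
# improvement over the best score, and report the best round.
def early_stop(eval_scores, patience=10):
    best_round, best = 0, float("-inf")
    for rnd, score in enumerate(eval_scores):
        if score > best:
            best, best_round = score, rnd
        elif rnd - best_round >= patience:
            break
    return best_round, best

scores = [0.76, 0.78, 0.79, 0.794, 0.793, 0.792, 0.791, 0.790,
          0.789, 0.788, 0.787, 0.786, 0.785, 0.784, 0.783]
print(early_stop(scores, patience=10))  # (3, 0.794)
```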

In [42]:
# Importing Hyperopt (and LightGBM for later use)
import hyperopt as hp
import lightgbm as lgb
from hyperopt import Trials,fmin,STATUS_OK
/Users/piumallick/anaconda3/lib/python3.7/site-packages/lightgbm/__init__.py:48: UserWarning: Starting from version 2.2.1, the library file in distribution wheels for macOS is built by the Apple Clang (Xcode_8.3.3) compiler.
This means that in case of installing LightGBM from PyPI via the ``pip install lightgbm`` command, you don't need to install the gcc compiler anymore.
Instead of that, you need to install the OpenMP library, which is required for running LightGBM on the system with the Apple Clang compiler.
You can install the OpenMP library by the following command: ``brew install libomp``.
  "You can install the OpenMP library by the following command: ``brew install libomp``.", UserWarning)
In [43]:
dtrain = xgb.DMatrix(data = x_train, label = y_train)
dval = xgb.DMatrix(data = x_val, label = y_val)
dtest = xgb.DMatrix(data = x_test, label = y_test)
In [44]:
# Sets the search space and the prior distributions over it.
# Note: the hyperopt labels below swap 'lambda' and 'alpha'
# ('reg_alpha' labels the L2 term and 'reg_lambda' the L1 term),
# so read the reported best parameters accordingly.
xgb_space = {
    'booster': hp.hp.choice('booster',  ['gbtree']),
    'eta': hp.hp.loguniform('learning_rate', -4, 0),
    'max_depth':hp.hp.choice('max_depth', np.arange(10, 300,1, dtype=int)),
    'subsample':hp.hp.quniform('subsample',0.5,1.0,0.05),
    'colsample_bytree':hp.hp.quniform('colsample_bytree',0.5,1.0,0.05),
    'min_child_weight':hp.hp.quniform('min_child_weight', 100, 1000,100),
    'lambda': hp.hp.uniform('reg_alpha', 0.0, 1000.0),
    'alpha': hp.hp.uniform('reg_lambda', 0.0, 1000.0),
    'gamma': hp.hp.uniform('reg_gamma', 0.0, 1000.0),
    'scale_pos_weight': hp.hp.uniform('scale_pos_weight', 1.0, 10.0),
    'eval_metric' : hp.hp.choice('eval_metric', ['auc']),
    'n_thread': hp.hp.choice('n_thread', [-1]),
    'verbose' : hp.hp.choice('verbose', [-1]),
    'maximize' : hp.hp.choice('maximize', ['TRUE'])
    }
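For example, `hp.loguniform('learning_rate', -4, 0)` draws exp(u) with u uniform on [-4, 0], so sampled learning rates lie in [e⁻⁴, 1] with more probability mass at small values. A quick self-contained sketch of that distribution:

```python
import math
import random

# Sample from the distribution hp.loguniform(-4, 0) describes:
# exp(u) for u ~ Uniform(-4, 0), i.e. values in [e^-4, 1].
random.seed(0)
samples = [math.exp(random.uniform(-4, 0)) for _ in range(1000)]
print(min(samples) >= math.exp(-4), max(samples) <= 1.0)  # True True
```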
In [45]:
def objective_m(params, n_folds=5):

    model = xgb.cv(params=params,
                   dtrain=dtrain,
                   num_boost_round=10,
                   early_stopping_rounds=10,
                   nfold=n_folds)

    # Return the best average loss on the validation set
    loss = 1 - max(model['test-auc-mean'])
    return loss

bayes_trials = Trials()
MAX_EVALS = 100 # this controls the runtime 

xgb_best_m = fmin(fn = objective_m, space = xgb_space, algo = hp.tpe.suggest, max_evals = MAX_EVALS, trials = bayes_trials)
100%|██████████| 100/100 [50:53<00:00, 30.54s/trial, best loss: 0.18500700000000003]
In [46]:
# XGBoost Best Parameters
xgb_best_m
Out[46]:
{'booster': 0,
 'colsample_bytree': 0.6000000000000001,
 'eval_metric': 0,
 'learning_rate': 0.28957411872608396,
 'max_depth': 148,
 'maximize': 0,
 'min_child_weight': 200.0,
 'n_thread': 0,
 'reg_alpha': 566.0477313007634,
 'reg_gamma': 6.212005945800067,
 'reg_lambda': 37.57823272056038,
 'scale_pos_weight': 9.941822730350838,
 'subsample': 1.0,
 'verbose': 0}
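Note that `fmin` reports `hp.choice` entries as indices into the choice list, so `'booster': 0` above means `'gbtree'`. Hyperopt's `space_eval` performs this decoding; a minimal sketch of the idea (choice lists below mirror the space defined earlier):

```python
# hp.choice values come back from fmin as indices; map them to the
# actual options they refer to.
choices = {"booster": ["gbtree"], "eval_metric": ["auc"]}
best = {"booster": 0, "eval_metric": 0, "subsample": 1.0}
decoded = {k: (choices[k][v] if k in choices else v)
           for k, v in best.items()}
print(decoded)  # {'booster': 'gbtree', 'eval_metric': 'auc', 'subsample': 1.0}
```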
In [47]:
# Parameters based on the HyperOpt search result (adjusted manually)
param = {'booster': 'gbtree',
 'colsample_bytree': 0.95,
 'eval_metric': 'auc',
 'objective':'binary:logistic',
 'learning_rate': 0.32,
 'max_depth': 152,
 'maximize': 0,
 'min_child_weight': 700.0,
 'n_thread': 0,
 'alpha': 105,
 'lambda': 140,
 'scale_pos_weight': 7,
 'subsample': 0.65,
 'verbose': 0}
In [48]:
# Specify validation set to watch performance
watchlist = [(dtrain, 'train'), (dval, 'eval')]
num_round = 100 #This is another hyperparameter of sorts
bst = xgb.train(param, dtrain, num_round, watchlist, early_stopping_rounds = 10)
[0]	train-auc:0.750247	eval-auc:0.734134
Multiple eval metrics have been passed: 'eval-auc' will be used for early stopping.

Will train until eval-auc hasn't improved in 10 rounds.
[1]	train-auc:0.768602	eval-auc:0.75546
[2]	train-auc:0.777298	eval-auc:0.764766
[3]	train-auc:0.780482	eval-auc:0.768677
[4]	train-auc:0.788189	eval-auc:0.774307
[5]	train-auc:0.798293	eval-auc:0.78277
[6]	train-auc:0.80188	eval-auc:0.786236
[7]	train-auc:0.806475	eval-auc:0.790884
[8]	train-auc:0.809088	eval-auc:0.79414
[9]	train-auc:0.812687	eval-auc:0.797245
[10]	train-auc:0.819248	eval-auc:0.804344
[11]	train-auc:0.821116	eval-auc:0.806352
[12]	train-auc:0.822314	eval-auc:0.807379
[13]	train-auc:0.823275	eval-auc:0.807556
[14]	train-auc:0.823661	eval-auc:0.808075
[15]	train-auc:0.824568	eval-auc:0.8087
[16]	train-auc:0.826794	eval-auc:0.810764
[17]	train-auc:0.827364	eval-auc:0.811007
[18]	train-auc:0.828002	eval-auc:0.811441
[19]	train-auc:0.830459	eval-auc:0.813992
[20]	train-auc:0.831098	eval-auc:0.814504
[21]	train-auc:0.831525	eval-auc:0.814437
[22]	train-auc:0.83211	eval-auc:0.814736
[23]	train-auc:0.832457	eval-auc:0.814922
[24]	train-auc:0.832882	eval-auc:0.815047
[25]	train-auc:0.833309	eval-auc:0.815069
[26]	train-auc:0.833657	eval-auc:0.815326
[27]	train-auc:0.834245	eval-auc:0.816033
[28]	train-auc:0.835444	eval-auc:0.817141
[29]	train-auc:0.835672	eval-auc:0.817445
[30]	train-auc:0.836163	eval-auc:0.817744
[31]	train-auc:0.836563	eval-auc:0.817911
[32]	train-auc:0.836909	eval-auc:0.818021
[33]	train-auc:0.837407	eval-auc:0.818182
[34]	train-auc:0.837786	eval-auc:0.818375
[35]	train-auc:0.83805	eval-auc:0.818371
[36]	train-auc:0.838208	eval-auc:0.818598
[37]	train-auc:0.838603	eval-auc:0.818668
[38]	train-auc:0.839038	eval-auc:0.818943
[39]	train-auc:0.839205	eval-auc:0.818863
[40]	train-auc:0.839533	eval-auc:0.818978
[41]	train-auc:0.839833	eval-auc:0.819102
[42]	train-auc:0.840136	eval-auc:0.819068
[43]	train-auc:0.840643	eval-auc:0.819963
[44]	train-auc:0.840896	eval-auc:0.820288
[45]	train-auc:0.841143	eval-auc:0.820282
[46]	train-auc:0.841301	eval-auc:0.820088
[47]	train-auc:0.841508	eval-auc:0.820327
[48]	train-auc:0.841891	eval-auc:0.820483
[49]	train-auc:0.842264	eval-auc:0.820688
[50]	train-auc:0.842753	eval-auc:0.821047
[51]	train-auc:0.843012	eval-auc:0.821326
[52]	train-auc:0.843475	eval-auc:0.821809
[53]	train-auc:0.843647	eval-auc:0.821809
[54]	train-auc:0.843993	eval-auc:0.821965
[55]	train-auc:0.844328	eval-auc:0.822314
[56]	train-auc:0.844767	eval-auc:0.822619
[57]	train-auc:0.844943	eval-auc:0.822568
[58]	train-auc:0.84506	eval-auc:0.8227
[59]	train-auc:0.845432	eval-auc:0.823101
[60]	train-auc:0.846004	eval-auc:0.823802
[61]	train-auc:0.8461	eval-auc:0.823981
[62]	train-auc:0.846428	eval-auc:0.824154
[63]	train-auc:0.846714	eval-auc:0.824383
[64]	train-auc:0.84694	eval-auc:0.82453
[65]	train-auc:0.847183	eval-auc:0.824576
[66]	train-auc:0.847405	eval-auc:0.824686
[67]	train-auc:0.847496	eval-auc:0.824805
[68]	train-auc:0.847659	eval-auc:0.824766
[69]	train-auc:0.84776	eval-auc:0.824636
[70]	train-auc:0.847883	eval-auc:0.824718
[71]	train-auc:0.847966	eval-auc:0.824756
[72]	train-auc:0.848136	eval-auc:0.82471
[73]	train-auc:0.848388	eval-auc:0.824967
[74]	train-auc:0.848602	eval-auc:0.825231
[75]	train-auc:0.848719	eval-auc:0.82506
[76]	train-auc:0.849037	eval-auc:0.82528
[77]	train-auc:0.849218	eval-auc:0.825317
[78]	train-auc:0.849363	eval-auc:0.825441
[79]	train-auc:0.849579	eval-auc:0.825574
[80]	train-auc:0.849874	eval-auc:0.825767
[81]	train-auc:0.85001	eval-auc:0.825781
[82]	train-auc:0.85041	eval-auc:0.825986
[83]	train-auc:0.850564	eval-auc:0.826049
[84]	train-auc:0.850804	eval-auc:0.826122
[85]	train-auc:0.85092	eval-auc:0.826077
[86]	train-auc:0.851173	eval-auc:0.826208
[87]	train-auc:0.851326	eval-auc:0.826316
[88]	train-auc:0.85143	eval-auc:0.826347
[89]	train-auc:0.851641	eval-auc:0.826414
[90]	train-auc:0.851851	eval-auc:0.826466
[91]	train-auc:0.852086	eval-auc:0.826568
[92]	train-auc:0.85222	eval-auc:0.826607
[93]	train-auc:0.852509	eval-auc:0.826612
[94]	train-auc:0.852626	eval-auc:0.826425
[95]	train-auc:0.85275	eval-auc:0.826522
[96]	train-auc:0.852838	eval-auc:0.826598
[97]	train-auc:0.853025	eval-auc:0.826585
[98]	train-auc:0.853122	eval-auc:0.826535
[99]	train-auc:0.853308	eval-auc:0.826586
In [49]:
# Lower accuracy, but a smaller gap between train and validation AUC
param = {'booster': 'gbtree',
 'colsample_bytree': 1.0,
 'eval_metric': 'auc',
 'objective': 'binary:logistic',
 'learning_rate': 0.20,
 'max_depth': 71,
 'maximize': True,
 'min_child_weight': 500.0,
 'nthread': -1,
 'alpha': 913,
 'gamma': 3,
 'lambda': 25,
 'scale_pos_weight': 4,
 'subsample': 1.0,
 'verbosity': 0}
In [50]:
# Specify validation sets to watch performance during training
watchlist = [(dtrain, 'train'), (dval, 'eval')]
num_round = 100 # This is another hyperparameter of sorts
bst = xgb.train(param, dtrain, num_round, watchlist, early_stopping_rounds = 10)
[0]	train-auc:0.742685	eval-auc:0.733831
Multiple eval metrics have been passed: 'eval-auc' will be used for early stopping.

Will train until eval-auc hasn't improved in 10 rounds.
[1]	train-auc:0.749262	eval-auc:0.741844
[2]	train-auc:0.757535	eval-auc:0.749857
[3]	train-auc:0.762538	eval-auc:0.755563
[4]	train-auc:0.767351	eval-auc:0.757893
[5]	train-auc:0.769365	eval-auc:0.759817
[6]	train-auc:0.774609	eval-auc:0.76486
[7]	train-auc:0.779298	eval-auc:0.769379
[8]	train-auc:0.783731	eval-auc:0.773162
[9]	train-auc:0.786561	eval-auc:0.7758
[10]	train-auc:0.788161	eval-auc:0.777829
[11]	train-auc:0.790517	eval-auc:0.779625
[12]	train-auc:0.792214	eval-auc:0.780711
[13]	train-auc:0.792344	eval-auc:0.780796
[14]	train-auc:0.793773	eval-auc:0.782398
[15]	train-auc:0.794785	eval-auc:0.783252
[16]	train-auc:0.795298	eval-auc:0.783887
[17]	train-auc:0.796214	eval-auc:0.784477
[18]	train-auc:0.79682	eval-auc:0.784738
[19]	train-auc:0.798087	eval-auc:0.785503
[20]	train-auc:0.798791	eval-auc:0.786014
[21]	train-auc:0.798807	eval-auc:0.786079
[22]	train-auc:0.799344	eval-auc:0.786369
[23]	train-auc:0.799596	eval-auc:0.786814
[24]	train-auc:0.800396	eval-auc:0.787385
[25]	train-auc:0.80088	eval-auc:0.787764
[26]	train-auc:0.801292	eval-auc:0.78804
[27]	train-auc:0.801266	eval-auc:0.788071
[28]	train-auc:0.80156	eval-auc:0.788074
[29]	train-auc:0.801943	eval-auc:0.788384
[30]	train-auc:0.802119	eval-auc:0.788425
[31]	train-auc:0.80228	eval-auc:0.788728
[32]	train-auc:0.803249	eval-auc:0.789648
[33]	train-auc:0.80362	eval-auc:0.789989
[34]	train-auc:0.80374	eval-auc:0.790171
[35]	train-auc:0.803722	eval-auc:0.790204
[36]	train-auc:0.80403	eval-auc:0.790358
[37]	train-auc:0.80403	eval-auc:0.790358
[38]	train-auc:0.80403	eval-auc:0.790358
[39]	train-auc:0.80403	eval-auc:0.790358
[40]	train-auc:0.80403	eval-auc:0.790358
[41]	train-auc:0.80403	eval-auc:0.790358
[42]	train-auc:0.80403	eval-auc:0.790358
[43]	train-auc:0.80403	eval-auc:0.790358
[44]	train-auc:0.80403	eval-auc:0.790358
[45]	train-auc:0.80403	eval-auc:0.790358
[46]	train-auc:0.80403	eval-auc:0.790358
Stopping. Best iteration:
[36]	train-auc:0.80403	eval-auc:0.790358
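The stopping behaviour in the log above (best round 36, training halted at round 46 after 10 stagnant rounds) can be sketched in plain Python. `early_stop` below is a hypothetical helper written for illustration, not part of xgboost: it tracks the best eval score and stops once `patience` rounds pass without improvement.

```python
# Minimal sketch of the early-stopping rule applied above (hypothetical
# helper, not part of xgboost): stop once `patience` rounds pass with no
# improvement in the eval metric, and report the best round and its score.
def early_stop(scores, patience=10):
    best, best_round = float('-inf'), -1
    for i, score in enumerate(scores):
        if score > best:
            best, best_round = score, i
        elif i - best_round >= patience:
            break  # no improvement for `patience` consecutive rounds
    return best_round, best

# Mirrors the log: once eval-auc plateaus, the earlier best round wins.
print(early_stop([0.73, 0.79, 0.80, 0.80, 0.80], patience=2))  # -> (2, 0.8)
```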

In [51]:
# Importing Hyperband
from hyperband import HyperbandSearchCV

# Hyperband scores models with scikit-learn scorers; list the available keys:
import sklearn
sklearn.metrics.SCORERS.keys()
Out[51]:
dict_keys(['explained_variance', 'r2', 'max_error', 'neg_median_absolute_error', 'neg_mean_absolute_error', 'neg_mean_squared_error', 'neg_mean_squared_log_error', 'neg_root_mean_squared_error', 'neg_mean_poisson_deviance', 'neg_mean_gamma_deviance', 'accuracy', 'roc_auc', 'roc_auc_ovr', 'roc_auc_ovo', 'roc_auc_ovr_weighted', 'roc_auc_ovo_weighted', 'balanced_accuracy', 'average_precision', 'neg_log_loss', 'neg_brier_score', 'adjusted_rand_score', 'homogeneity_score', 'completeness_score', 'v_measure_score', 'mutual_info_score', 'adjusted_mutual_info_score', 'normalized_mutual_info_score', 'fowlkes_mallows_score', 'precision', 'precision_macro', 'precision_micro', 'precision_samples', 'precision_weighted', 'recall', 'recall_macro', 'recall_micro', 'recall_samples', 'recall_weighted', 'f1', 'f1_macro', 'f1_micro', 'f1_samples', 'f1_weighted', 'jaccard', 'jaccard_macro', 'jaccard_micro', 'jaccard_samples', 'jaccard_weighted'])
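Any key in that list can be resolved into a callable scorer, which is how a string like `scoring='roc_auc'` is interpreted downstream. A minimal sketch on a toy dataset (the data here is invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import get_scorer

# Toy, trivially separable data (illustrative only)
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])

clf = LogisticRegression().fit(X, y)

# get_scorer turns a scoring string into a callable(estimator, X, y)
auc = get_scorer('roc_auc')(clf, X, y)
print(auc)  # -> 1.0 on this perfectly separable toy set
```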
In [52]:
hb_xgb_model = xgb.XGBClassifier()

# Note: np.arange(0.001) evaluates to array([0.]), so this grid only ever
# offers learning_rate = 0 -- which explains the chance-level best score
# of 0.5 reported below.
xgb_hb_param_dict = {'max_depth' : np.arange(70, 160),
                    'learning_rate' : np.arange(0.001),
                    'n_estimators' : np.arange(50, 200),
                    'objective' : ['binary:logistic'],
                    #'gamma' : [],
                    #'min_child_weight' : [],
                    #'max_delta_step' : [],
                    #'subsample' : [],
                    #'colsample_bytree' : [],
                    #'colsample_bylevel' : [],
                    #'colsample_bynode' : [],
                    #'reg_alpha' : [],
                    #'reg_lambda' : [],
                    #'scale_pos_weight' : []
                    }
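The `learning_rate` entry above is worth a closer look: with a single argument, `np.arange(stop)` counts from 0 in steps of 1, so `np.arange(0.001)` yields only `[0.]`. A quick demonstration of the pitfall and an explicit `(start, stop, step)` grid that would work instead:

```python
import numpy as np

# Single-argument arange counts from 0 in steps of 1, so the only value
# below 0.001 is 0.0 -- i.e. a zero learning rate.
print(np.arange(0.001).tolist())  # -> [0.0]

# An explicit (start, stop, step) grid gives usable learning rates:
grid = np.arange(0.01, 0.3, 0.05)
print(grid.round(2).tolist())
```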
In [53]:
xgb_search = HyperbandSearchCV(hb_xgb_model, xgb_hb_param_dict, cv=3,
                               verbose=1,
                               max_iter=20, min_iter=5,
                               scoring='roc_auc')
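Before fitting, the search budget can be sanity-checked. Assuming the usual halving factor eta = 3 (an assumption about HyperbandSearchCV's default, not stated here), Hyperband runs floor(log_eta(max_iter / min_iter)) + 1 brackets, which for max_iter=20 and min_iter=5 gives the "bracket 1 (out of 2)" seen in the fit output:

```python
import math

# Hyperband bracket-count sketch; eta = 3 is an assumed halving factor.
eta, max_iter, min_iter = 3, 20, 5
s_max = int(math.log(max_iter / min_iter, eta))
print(s_max + 1)  # -> 2 brackets
```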
In [54]:
xgb_search.fit(x_train,y_train)
Starting bracket 1 (out of 2) of hyperband
Starting successive halving iteration 1 out of 2. Fitting 3 configurations, with resource_param n_estimators set to 6, and keeping the best 1 configurations.
Fitting 3 folds for each of 3 candidates, totalling 9 fits
[Parallel(n_jobs=1)]: Using backend SequentialBackend with 1 concurrent workers.
[Parallel(n_jobs=1)]: Done   9 out of   9 | elapsed:  2.0min finished
/Users/piumallick/anaconda3/lib/python3.7/site-packages/sklearn/model_selection/_search.py:823: FutureWarning: The parameter 'iid' is deprecated in 0.22 and will be removed in 0.24.
  "removed in 0.24.", FutureWarning
[Parallel(n_jobs=1)]: Using backend SequentialBackend with 1 concurrent workers.
Starting successive halving iteration 2 out of 2. Fitting 1 configurations, with resource_param n_estimators set to 20
Fitting 3 folds for each of 1 candidates, totalling 3 fits
[Parallel(n_jobs=1)]: Done   3 out of   3 | elapsed:  2.1min finished
/Users/piumallick/anaconda3/lib/python3.7/site-packages/sklearn/model_selection/_search.py:823: FutureWarning: The parameter 'iid' is deprecated in 0.22 and will be removed in 0.24.
  "removed in 0.24.", FutureWarning
[Parallel(n_jobs=1)]: Using backend SequentialBackend with 1 concurrent workers.
Starting bracket 2 (out of 2) of hyperband
Starting successive halving iteration 1 out of 1. Fitting 2 configurations, with resource_param n_estimators set to 20
Fitting 3 folds for each of 2 candidates, totalling 6 fits
[Parallel(n_jobs=1)]: Done   6 out of   6 | elapsed:  4.1min finished
/Users/piumallick/anaconda3/lib/python3.7/site-packages/sklearn/model_selection/_search.py:823: FutureWarning: The parameter 'iid' is deprecated in 0.22 and will be removed in 0.24.
  "removed in 0.24.", FutureWarning
Out[54]:
HyperbandSearchCV(cv=3, error_score='raise',
                  estimator=XGBClassifier(base_score=0.5, booster='gbtree',
                                          colsample_bylevel=1,
                                          colsample_bynode=1,
                                          colsample_bytree=1, gamma=0,
                                          learning_rate=0.1, max_delta_step=0,
                                          max_depth=3, min_child_weight=1,
                                          missing=None, n_estimators=100,
                                          n_jobs=1, nthread=None,
                                          objective='binary:logistic',
                                          random_state=0, reg_alpha=0,
                                          reg_lambda=1...
       154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166,
       167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179,
       180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192,
       193, 194, 195, 196, 197, 198, 199]),
                                       'objective': ['binary:logistic']},
                  pre_dispatch='2*n_jobs', random_state=None, refit=True,
                  resource_param='n_estimators', return_train_score=False,
                  scoring='roc_auc', skip_last=0, verbose=1)
In [55]:
xgb_search.best_score_
Out[55]:
0.5

So, using XGBoost (with Hyperopt), we obtain a Train AUC of 0.853308 and a Validation AUC of 0.826586.

The gap between Train and Validation AUC is approximately 0.026722, so the model appears reasonably well fit rather than overfit. The Hyperband search, by contrast, returned a chance-level best score of 0.5: its learning-rate grid, np.arange(0.001), contains only 0.

Now, let's see what results we obtain with LightGBM before deciding which model to choose.

Choosing the 3rd model:

3. LightGBM Model

In [56]:
# Importing LightGBM
import lightgbm as lgb
In [57]:
# LightGBM Manual Parameter Tuning
lgb_params = {
    'boosting_type': 'gbdt',
    'objective': 'binary',
    'metric': 'auc',
    'max_depth' : 3,
    'num_leaves' : 51,
    'learning_rate': 0.1,
    'num_threads' : -1,
    'scale_pos_weight' : 1.1,
    'early_stopping_round' : 20,
    #'top_rate' : 0.6,
    #'other_rate' : 0.05,
    'lambda_l1' : 20,
    'lambda_l2' : 0.09951595323870244
}
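One interaction in the dictionary above is easy to miss: a binary tree limited to depth d has at most 2**d leaves, so with `max_depth = 3` the effective cap is 8 and `num_leaves = 51` is inert. A one-line check:

```python
# A depth-limited binary tree has at most 2**max_depth leaves, so with
# max_depth = 3 the effective leaf budget is 8 and num_leaves = 51 is inert.
max_depth, num_leaves = 3, 51
effective_leaves = min(num_leaves, 2 ** max_depth)
print(effective_leaves)  # -> 8
```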
In [58]:
lgb_proc_train_rh = lgb.Dataset(x_train, y_train)
lgb_proc_val_rh = lgb.Dataset(x_val, y_val)
In [59]:
gbm_rh = lgb.train(params = lgb_params, train_set = lgb_proc_train_rh,
                num_boost_round = 5000, valid_sets = [lgb_proc_val_rh, lgb_proc_train_rh],
               valid_names = ['Evaluation', 'Train'])
/Users/piumallick/anaconda3/lib/python3.7/site-packages/lightgbm/engine.py:153: UserWarning: Found `early_stopping_round` in params. Will use it instead of argument
  warnings.warn("Found `{}` in params. Will use it instead of argument".format(alias))
[1]	Train's auc: 0.746099	Evaluation's auc: 0.737671
Training until validation scores don't improve for 20 rounds
[2]	Train's auc: 0.759424	Evaluation's auc: 0.752346
[3]	Train's auc: 0.760165	Evaluation's auc: 0.752714
[4]	Train's auc: 0.765161	Evaluation's auc: 0.759173
[5]	Train's auc: 0.764795	Evaluation's auc: 0.758286
[6]	Train's auc: 0.766336	Evaluation's auc: 0.758787
[7]	Train's auc: 0.768518	Evaluation's auc: 0.760997
[8]	Train's auc: 0.773839	Evaluation's auc: 0.765187
[9]	Train's auc: 0.776775	Evaluation's auc: 0.767291
[10]	Train's auc: 0.779615	Evaluation's auc: 0.769877
[11]	Train's auc: 0.785295	Evaluation's auc: 0.775861
[12]	Train's auc: 0.787069	Evaluation's auc: 0.776502
[13]	Train's auc: 0.788873	Evaluation's auc: 0.778081
[14]	Train's auc: 0.790106	Evaluation's auc: 0.779426
[15]	Train's auc: 0.792344	Evaluation's auc: 0.781349
[16]	Train's auc: 0.794798	Evaluation's auc: 0.783194
[17]	Train's auc: 0.795146	Evaluation's auc: 0.783768
[18]	Train's auc: 0.796552	Evaluation's auc: 0.785056
[19]	Train's auc: 0.797846	Evaluation's auc: 0.785809
[20]	Train's auc: 0.798451	Evaluation's auc: 0.785843
[21]	Train's auc: 0.799177	Evaluation's auc: 0.786626
[22]	Train's auc: 0.799487	Evaluation's auc: 0.786859
[23]	Train's auc: 0.800682	Evaluation's auc: 0.787653
[24]	Train's auc: 0.801601	Evaluation's auc: 0.788118
[25]	Train's auc: 0.802831	Evaluation's auc: 0.789296
[26]	Train's auc: 0.803575	Evaluation's auc: 0.789936
[27]	Train's auc: 0.805655	Evaluation's auc: 0.792076
[28]	Train's auc: 0.806132	Evaluation's auc: 0.792355
[29]	Train's auc: 0.806482	Evaluation's auc: 0.792796
[30]	Train's auc: 0.807828	Evaluation's auc: 0.794053
[31]	Train's auc: 0.808449	Evaluation's auc: 0.794537
[32]	Train's auc: 0.80917	Evaluation's auc: 0.795125
[33]	Train's auc: 0.811215	Evaluation's auc: 0.797431
[34]	Train's auc: 0.811662	Evaluation's auc: 0.797655
[35]	Train's auc: 0.812173	Evaluation's auc: 0.798341
[36]	Train's auc: 0.812404	Evaluation's auc: 0.798492
[37]	Train's auc: 0.812905	Evaluation's auc: 0.798779
[38]	Train's auc: 0.8136	Evaluation's auc: 0.799331
[39]	Train's auc: 0.814277	Evaluation's auc: 0.800107
[40]	Train's auc: 0.814674	Evaluation's auc: 0.800668
[41]	Train's auc: 0.81483	Evaluation's auc: 0.800638
[42]	Train's auc: 0.815771	Evaluation's auc: 0.801351
[43]	Train's auc: 0.816582	Evaluation's auc: 0.801897
[44]	Train's auc: 0.816672	Evaluation's auc: 0.801986
[45]	Train's auc: 0.817484	Evaluation's auc: 0.802736
[46]	Train's auc: 0.817676	Evaluation's auc: 0.802863
[47]	Train's auc: 0.819231	Evaluation's auc: 0.804287
[48]	Train's auc: 0.819691	Evaluation's auc: 0.804552
[49]	Train's auc: 0.820377	Evaluation's auc: 0.805087
[50]	Train's auc: 0.820626	Evaluation's auc: 0.805446
[51]	Train's auc: 0.82176	Evaluation's auc: 0.806854
[52]	Train's auc: 0.822943	Evaluation's auc: 0.80789
[53]	Train's auc: 0.823333	Evaluation's auc: 0.808271
[54]	Train's auc: 0.823643	Evaluation's auc: 0.808414
[55]	Train's auc: 0.824503	Evaluation's auc: 0.809417
[56]	Train's auc: 0.825379	Evaluation's auc: 0.810144
[57]	Train's auc: 0.825869	Evaluation's auc: 0.810496
[58]	Train's auc: 0.826524	Evaluation's auc: 0.811021
[59]	Train's auc: 0.826686	Evaluation's auc: 0.81112
[60]	Train's auc: 0.827675	Evaluation's auc: 0.812095
[61]	Train's auc: 0.827915	Evaluation's auc: 0.812325
[62]	Train's auc: 0.828142	Evaluation's auc: 0.812488
[63]	Train's auc: 0.828342	Evaluation's auc: 0.812624
[64]	Train's auc: 0.828954	Evaluation's auc: 0.813222
[65]	Train's auc: 0.829283	Evaluation's auc: 0.813587
[66]	Train's auc: 0.829948	Evaluation's auc: 0.814154
[67]	Train's auc: 0.830112	Evaluation's auc: 0.814217
[68]	Train's auc: 0.830328	Evaluation's auc: 0.814298
[69]	Train's auc: 0.830511	Evaluation's auc: 0.814463
[70]	Train's auc: 0.830812	Evaluation's auc: 0.814833
[71]	Train's auc: 0.831285	Evaluation's auc: 0.815248
[72]	Train's auc: 0.831702	Evaluation's auc: 0.815614
[73]	Train's auc: 0.832003	Evaluation's auc: 0.815871
[74]	Train's auc: 0.832541	Evaluation's auc: 0.816368
[75]	Train's auc: 0.83275	Evaluation's auc: 0.81654
[76]	Train's auc: 0.832866	Evaluation's auc: 0.816599
[77]	Train's auc: 0.833073	Evaluation's auc: 0.816755
[78]	Train's auc: 0.833581	Evaluation's auc: 0.817361
[79]	Train's auc: 0.833926	Evaluation's auc: 0.817808
[80]	Train's auc: 0.834038	Evaluation's auc: 0.817918
[81]	Train's auc: 0.834469	Evaluation's auc: 0.818319
[82]	Train's auc: 0.834611	Evaluation's auc: 0.818393
[83]	Train's auc: 0.834958	Evaluation's auc: 0.818871
[84]	Train's auc: 0.83513	Evaluation's auc: 0.81899
[85]	Train's auc: 0.835311	Evaluation's auc: 0.819194
[86]	Train's auc: 0.835664	Evaluation's auc: 0.819482
[87]	Train's auc: 0.835912	Evaluation's auc: 0.819569
[88]	Train's auc: 0.836048	Evaluation's auc: 0.81959
[89]	Train's auc: 0.836181	Evaluation's auc: 0.819584
[90]	Train's auc: 0.836381	Evaluation's auc: 0.819694
[91]	Train's auc: 0.836594	Evaluation's auc: 0.819858
[92]	Train's auc: 0.8369	Evaluation's auc: 0.820299
[93]	Train's auc: 0.837207	Evaluation's auc: 0.8206
[94]	Train's auc: 0.837393	Evaluation's auc: 0.820827
[95]	Train's auc: 0.837543	Evaluation's auc: 0.82096
[96]	Train's auc: 0.837845	Evaluation's auc: 0.821236
[97]	Train's auc: 0.837973	Evaluation's auc: 0.821303
[98]	Train's auc: 0.838349	Evaluation's auc: 0.821646
[99]	Train's auc: 0.838596	Evaluation's auc: 0.82182
[100]	Train's auc: 0.838682	Evaluation's auc: 0.821898
[101]	Train's auc: 0.838937	Evaluation's auc: 0.822197
[102]	Train's auc: 0.839088	Evaluation's auc: 0.822333
[103]	Train's auc: 0.839314	Evaluation's auc: 0.822635
[104]	Train's auc: 0.839423	Evaluation's auc: 0.822702
[105]	Train's auc: 0.839575	Evaluation's auc: 0.822706
[106]	Train's auc: 0.83986	Evaluation's auc: 0.822951
[107]	Train's auc: 0.840307	Evaluation's auc: 0.823477
[108]	Train's auc: 0.840404	Evaluation's auc: 0.823507
[109]	Train's auc: 0.840542	Evaluation's auc: 0.823642
[110]	Train's auc: 0.840613	Evaluation's auc: 0.82364
[111]	Train's auc: 0.8407	Evaluation's auc: 0.823623
[112]	Train's auc: 0.840831	Evaluation's auc: 0.823696
[113]	Train's auc: 0.840991	Evaluation's auc: 0.823804
[114]	Train's auc: 0.841058	Evaluation's auc: 0.823837
[115]	Train's auc: 0.841206	Evaluation's auc: 0.823936
[116]	Train's auc: 0.841296	Evaluation's auc: 0.824009
[117]	Train's auc: 0.841396	Evaluation's auc: 0.824046
[118]	Train's auc: 0.841533	Evaluation's auc: 0.824114
[119]	Train's auc: 0.841741	Evaluation's auc: 0.824321
[120]	Train's auc: 0.841851	Evaluation's auc: 0.824338
[121]	Train's auc: 0.842003	Evaluation's auc: 0.824394
[122]	Train's auc: 0.842103	Evaluation's auc: 0.824518
[123]	Train's auc: 0.842214	Evaluation's auc: 0.824526
[124]	Train's auc: 0.84234	Evaluation's auc: 0.82471
[125]	Train's auc: 0.842497	Evaluation's auc: 0.824809
[126]	Train's auc: 0.842593	Evaluation's auc: 0.824882
[127]	Train's auc: 0.842697	Evaluation's auc: 0.824924
[128]	Train's auc: 0.842853	Evaluation's auc: 0.825104
[129]	Train's auc: 0.84294	Evaluation's auc: 0.825146
[130]	Train's auc: 0.843041	Evaluation's auc: 0.825237
[131]	Train's auc: 0.843083	Evaluation's auc: 0.825195
[132]	Train's auc: 0.843211	Evaluation's auc: 0.825252
[133]	Train's auc: 0.84328	Evaluation's auc: 0.825226
[134]	Train's auc: 0.8434	Evaluation's auc: 0.825218
[135]	Train's auc: 0.843506	Evaluation's auc: 0.825329
[136]	Train's auc: 0.843608	Evaluation's auc: 0.825338
[137]	Train's auc: 0.843674	Evaluation's auc: 0.82534
[138]	Train's auc: 0.843891	Evaluation's auc: 0.825604
[139]	Train's auc: 0.84398	Evaluation's auc: 0.825591
[140]	Train's auc: 0.844089	Evaluation's auc: 0.825683
[141]	Train's auc: 0.844276	Evaluation's auc: 0.825838
[142]	Train's auc: 0.844666	Evaluation's auc: 0.826279
[143]	Train's auc: 0.844795	Evaluation's auc: 0.826219
[144]	Train's auc: 0.844846	Evaluation's auc: 0.8262
[145]	Train's auc: 0.844955	Evaluation's auc: 0.826275
[146]	Train's auc: 0.845256	Evaluation's auc: 0.826623
[147]	Train's auc: 0.845374	Evaluation's auc: 0.826687
[148]	Train's auc: 0.845431	Evaluation's auc: 0.826746
[149]	Train's auc: 0.845473	Evaluation's auc: 0.826766
[150]	Train's auc: 0.845584	Evaluation's auc: 0.826814
[151]	Train's auc: 0.845656	Evaluation's auc: 0.826787
[152]	Train's auc: 0.845882	Evaluation's auc: 0.827083
[153]	Train's auc: 0.845977	Evaluation's auc: 0.827197
[154]	Train's auc: 0.846083	Evaluation's auc: 0.827229
[155]	Train's auc: 0.846165	Evaluation's auc: 0.827246
[156]	Train's auc: 0.846424	Evaluation's auc: 0.82743
[157]	Train's auc: 0.8465	Evaluation's auc: 0.827464
[158]	Train's auc: 0.846678	Evaluation's auc: 0.827695
[159]	Train's auc: 0.846806	Evaluation's auc: 0.827744
[160]	Train's auc: 0.847041	Evaluation's auc: 0.827913
[161]	Train's auc: 0.847134	Evaluation's auc: 0.827959
[162]	Train's auc: 0.847269	Evaluation's auc: 0.828101
[163]	Train's auc: 0.847342	Evaluation's auc: 0.828153
[164]	Train's auc: 0.847453	Evaluation's auc: 0.828225
[165]	Train's auc: 0.847552	Evaluation's auc: 0.828273
[166]	Train's auc: 0.847667	Evaluation's auc: 0.828414
[167]	Train's auc: 0.847852	Evaluation's auc: 0.828574
[168]	Train's auc: 0.847923	Evaluation's auc: 0.828592
[169]	Train's auc: 0.847999	Evaluation's auc: 0.828603
[170]	Train's auc: 0.848097	Evaluation's auc: 0.828666
[171]	Train's auc: 0.84827	Evaluation's auc: 0.828764
[172]	Train's auc: 0.848453	Evaluation's auc: 0.828881
[173]	Train's auc: 0.848551	Evaluation's auc: 0.829004
[174]	Train's auc: 0.848623	Evaluation's auc: 0.828993
[175]	Train's auc: 0.848701	Evaluation's auc: 0.829012
[176]	Train's auc: 0.848781	Evaluation's auc: 0.829036
[177]	Train's auc: 0.848843	Evaluation's auc: 0.829089
[178]	Train's auc: 0.84893	Evaluation's auc: 0.829044
[179]	Train's auc: 0.848994	Evaluation's auc: 0.829062
[180]	Train's auc: 0.849083	Evaluation's auc: 0.829123
[181]	Train's auc: 0.849267	Evaluation's auc: 0.829272
[182]	Train's auc: 0.849606	Evaluation's auc: 0.829627
[183]	Train's auc: 0.849662	Evaluation's auc: 0.829632
[184]	Train's auc: 0.849721	Evaluation's auc: 0.829666
[185]	Train's auc: 0.849782	Evaluation's auc: 0.829685
[186]	Train's auc: 0.849894	Evaluation's auc: 0.829786
[187]	Train's auc: 0.850007	Evaluation's auc: 0.829843
[188]	Train's auc: 0.850051	Evaluation's auc: 0.829924
[189]	Train's auc: 0.850112	Evaluation's auc: 0.829904
[190]	Train's auc: 0.850215	Evaluation's auc: 0.830009
[191]	Train's auc: 0.850352	Evaluation's auc: 0.830142
[192]	Train's auc: 0.850499	Evaluation's auc: 0.830301
[193]	Train's auc: 0.850578	Evaluation's auc: 0.830378
[194]	Train's auc: 0.850639	Evaluation's auc: 0.830402
[195]	Train's auc: 0.850685	Evaluation's auc: 0.830457
[196]	Train's auc: 0.850764	Evaluation's auc: 0.830546
[197]	Train's auc: 0.850861	Evaluation's auc: 0.83065
[198]	Train's auc: 0.850907	Evaluation's auc: 0.830616
[199]	Train's auc: 0.850988	Evaluation's auc: 0.83067
[200]	Train's auc: 0.851054	Evaluation's auc: 0.830681
[201]	Train's auc: 0.851154	Evaluation's auc: 0.830771
[202]	Train's auc: 0.851222	Evaluation's auc: 0.830837
[203]	Train's auc: 0.851335	Evaluation's auc: 0.830908
[204]	Train's auc: 0.851397	Evaluation's auc: 0.830942
[205]	Train's auc: 0.8515	Evaluation's auc: 0.83094
[206]	Train's auc: 0.851551	Evaluation's auc: 0.830978
[207]	Train's auc: 0.851639	Evaluation's auc: 0.831026
[208]	Train's auc: 0.851692	Evaluation's auc: 0.831043
[209]	Train's auc: 0.851749	Evaluation's auc: 0.831067
[210]	Train's auc: 0.851801	Evaluation's auc: 0.831083
[211]	Train's auc: 0.851915	Evaluation's auc: 0.831173
[212]	Train's auc: 0.852023	Evaluation's auc: 0.831178
[213]	Train's auc: 0.852151	Evaluation's auc: 0.831231
[214]	Train's auc: 0.852197	Evaluation's auc: 0.831234
[215]	Train's auc: 0.852222	Evaluation's auc: 0.831248
[216]	Train's auc: 0.852357	Evaluation's auc: 0.831317
[217]	Train's auc: 0.852426	Evaluation's auc: 0.831324
[218]	Train's auc: 0.852517	Evaluation's auc: 0.83135
[219]	Train's auc: 0.852598	Evaluation's auc: 0.831322
[220]	Train's auc: 0.852644	Evaluation's auc: 0.831347
[221]	Train's auc: 0.852758	Evaluation's auc: 0.831416
[222]	Train's auc: 0.852802	Evaluation's auc: 0.831431
[223]	Train's auc: 0.852854	Evaluation's auc: 0.831415
[224]	Train's auc: 0.853	Evaluation's auc: 0.831527
[225]	Train's auc: 0.853062	Evaluation's auc: 0.831616
[226]	Train's auc: 0.853472	Evaluation's auc: 0.831891
[227]	Train's auc: 0.853516	Evaluation's auc: 0.831873
[228]	Train's auc: 0.853606	Evaluation's auc: 0.831857
[229]	Train's auc: 0.853707	Evaluation's auc: 0.831907
[230]	Train's auc: 0.85376	Evaluation's auc: 0.831888
[231]	Train's auc: 0.85382	Evaluation's auc: 0.831883
[232]	Train's auc: 0.853876	Evaluation's auc: 0.831913
[233]	Train's auc: 0.853943	Evaluation's auc: 0.832018
[234]	Train's auc: 0.854088	Evaluation's auc: 0.832205
[235]	Train's auc: 0.854208	Evaluation's auc: 0.832295
[236]	Train's auc: 0.854264	Evaluation's auc: 0.83229
[237]	Train's auc: 0.85458	Evaluation's auc: 0.832474
[238]	Train's auc: 0.854712	Evaluation's auc: 0.832476
[239]	Train's auc: 0.854762	Evaluation's auc: 0.83248
[240]	Train's auc: 0.854857	Evaluation's auc: 0.832527
[241]	Train's auc: 0.854947	Evaluation's auc: 0.832612
[242]	Train's auc: 0.855024	Evaluation's auc: 0.83265
[243]	Train's auc: 0.855142	Evaluation's auc: 0.832769
[244]	Train's auc: 0.855227	Evaluation's auc: 0.832807
[245]	Train's auc: 0.855345	Evaluation's auc: 0.832793
[246]	Train's auc: 0.855409	Evaluation's auc: 0.832762
[247]	Train's auc: 0.855438	Evaluation's auc: 0.832741
[248]	Train's auc: 0.855469	Evaluation's auc: 0.832725
[249]	Train's auc: 0.855579	Evaluation's auc: 0.832885
[250]	Train's auc: 0.855653	Evaluation's auc: 0.832972
[251]	Train's auc: 0.855861	Evaluation's auc: 0.833177
[252]	Train's auc: 0.855906	Evaluation's auc: 0.833206
[253]	Train's auc: 0.856003	Evaluation's auc: 0.833255
[254]	Train's auc: 0.856049	Evaluation's auc: 0.833269
[255]	Train's auc: 0.856164	Evaluation's auc: 0.833381
[256]	Train's auc: 0.856251	Evaluation's auc: 0.833416
[257]	Train's auc: 0.856328	Evaluation's auc: 0.8335
[258]	Train's auc: 0.856428	Evaluation's auc: 0.83363
[259]	Train's auc: 0.856489	Evaluation's auc: 0.833655
[260]	Train's auc: 0.856548	Evaluation's auc: 0.833692
[261]	Train's auc: 0.856633	Evaluation's auc: 0.833748
[262]	Train's auc: 0.856672	Evaluation's auc: 0.833746
[263]	Train's auc: 0.856783	Evaluation's auc: 0.833799
[264]	Train's auc: 0.856855	Evaluation's auc: 0.833868
[265]	Train's auc: 0.856914	Evaluation's auc: 0.833862
[266]	Train's auc: 0.857004	Evaluation's auc: 0.833923
[267]	Train's auc: 0.857051	Evaluation's auc: 0.833934
[268]	Train's auc: 0.85713	Evaluation's auc: 0.833993
[269]	Train's auc: 0.857182	Evaluation's auc: 0.833982
[270]	Train's auc: 0.857235	Evaluation's auc: 0.834035
[271]	Train's auc: 0.85731	Evaluation's auc: 0.834078
[272]	Train's auc: 0.857389	Evaluation's auc: 0.834108
[273]	Train's auc: 0.857462	Evaluation's auc: 0.834147
[274]	Train's auc: 0.857527	Evaluation's auc: 0.834167
[275]	Train's auc: 0.857586	Evaluation's auc: 0.834139
[276]	Train's auc: 0.857618	Evaluation's auc: 0.834148
[277]	Train's auc: 0.857679	Evaluation's auc: 0.834163
[278]	Train's auc: 0.857729	Evaluation's auc: 0.834175
[279]	Train's auc: 0.857769	Evaluation's auc: 0.834139
[280]	Train's auc: 0.857832	Evaluation's auc: 0.834192
[281]	Train's auc: 0.857974	Evaluation's auc: 0.834372
[282]	Train's auc: 0.85802	Evaluation's auc: 0.834437
[283]	Train's auc: 0.858146	Evaluation's auc: 0.834523
[284]	Train's auc: 0.85823	Evaluation's auc: 0.834543
[285]	Train's auc: 0.858297	Evaluation's auc: 0.834533
[286]	Train's auc: 0.858367	Evaluation's auc: 0.834574
[287]	Train's auc: 0.858401	Evaluation's auc: 0.834589
[288]	Train's auc: 0.85849	Evaluation's auc: 0.8346
[289]	Train's auc: 0.858548	Evaluation's auc: 0.834593
[290]	Train's auc: 0.858624	Evaluation's auc: 0.834658
[291]	Train's auc: 0.85871	Evaluation's auc: 0.834722
[292]	Train's auc: 0.858787	Evaluation's auc: 0.834758
[293]	Train's auc: 0.858866	Evaluation's auc: 0.834807
[294]	Train's auc: 0.858914	Evaluation's auc: 0.834793
[295]	Train's auc: 0.858957	Evaluation's auc: 0.834748
[296]	Train's auc: 0.859066	Evaluation's auc: 0.834849
[297]	Train's auc: 0.859177	Evaluation's auc: 0.834885
[298]	Train's auc: 0.85926	Evaluation's auc: 0.834955
[299]	Train's auc: 0.859311	Evaluation's auc: 0.834977
[300]	Train's auc: 0.859363	Evaluation's auc: 0.835021
[301]	Train's auc: 0.859438	Evaluation's auc: 0.835025
[302]	Train's auc: 0.859491	Evaluation's auc: 0.83501
[303]	Train's auc: 0.859553	Evaluation's auc: 0.835087
[304]	Train's auc: 0.859575	Evaluation's auc: 0.835085
[305]	Train's auc: 0.859617	Evaluation's auc: 0.83508
[306]	Train's auc: 0.859678	Evaluation's auc: 0.835104
[307]	Train's auc: 0.859721	Evaluation's auc: 0.83512
[308]	Train's auc: 0.859751	Evaluation's auc: 0.835118
[309]	Train's auc: 0.859833	Evaluation's auc: 0.835158
[310]	Train's auc: 0.859882	Evaluation's auc: 0.835231
[311]	Train's auc: 0.859913	Evaluation's auc: 0.835268
[312]	Train's auc: 0.85998	Evaluation's auc: 0.835274
[313]	Train's auc: 0.860046	Evaluation's auc: 0.835299
[314]	Train's auc: 0.860176	Evaluation's auc: 0.835401
[315]	Train's auc: 0.860263	Evaluation's auc: 0.835469
[316]	Train's auc: 0.860331	Evaluation's auc: 0.83557
[317]	Train's auc: 0.860411	Evaluation's auc: 0.835615
[318]	Train's auc: 0.860475	Evaluation's auc: 0.835652
[319]	Train's auc: 0.860541	Evaluation's auc: 0.8357
[320]	Train's auc: 0.860609	Evaluation's auc: 0.835712
[321]	Train's auc: 0.860639	Evaluation's auc: 0.835756
[322]	Train's auc: 0.860705	Evaluation's auc: 0.835821
[323]	Train's auc: 0.860738	Evaluation's auc: 0.835811
[324]	Train's auc: 0.860783	Evaluation's auc: 0.835821
[325]	Train's auc: 0.860865	Evaluation's auc: 0.835858
[326]	Train's auc: 0.86092	Evaluation's auc: 0.835876
[327]	Train's auc: 0.860982	Evaluation's auc: 0.835885
[328]	Train's auc: 0.861026	Evaluation's auc: 0.835848
[329]	Train's auc: 0.861087	Evaluation's auc: 0.835832
[330]	Train's auc: 0.861123	Evaluation's auc: 0.835862
[331]	Train's auc: 0.861181	Evaluation's auc: 0.835889
[332]	Train's auc: 0.861314	Evaluation's auc: 0.836045
[333]	Train's auc: 0.861361	Evaluation's auc: 0.836054
[334]	Train's auc: 0.861468	Evaluation's auc: 0.836142
[335]	Train's auc: 0.861511	Evaluation's auc: 0.836134
[336]	Train's auc: 0.861617	Evaluation's auc: 0.836181
[337]	Train's auc: 0.86168	Evaluation's auc: 0.83619
[338]	Train's auc: 0.861757	Evaluation's auc: 0.836247
[339]	Train's auc: 0.861815	Evaluation's auc: 0.836277
[340]	Train's auc: 0.861865	Evaluation's auc: 0.836348
[341]	Train's auc: 0.861928	Evaluation's auc: 0.836321
[342]	Train's auc: 0.862031	Evaluation's auc: 0.836468
[343]	Train's auc: 0.86214	Evaluation's auc: 0.83656
[344]	Train's auc: 0.862194	Evaluation's auc: 0.836562
[345]	Train's auc: 0.862245	Evaluation's auc: 0.83655
[346]	Train's auc: 0.862279	Evaluation's auc: 0.836562
[347]	Train's auc: 0.862341	Evaluation's auc: 0.836602
[348]	Train's auc: 0.862432	Evaluation's auc: 0.836614
[349]	Train's auc: 0.862464	Evaluation's auc: 0.836629
[350]	Train's auc: 0.862502	Evaluation's auc: 0.836602
[351]	Train's auc: 0.862565	Evaluation's auc: 0.836625
[352]	Train's auc: 0.862595	Evaluation's auc: 0.836588
[353]	Train's auc: 0.86264	Evaluation's auc: 0.836589
[354]	Train's auc: 0.862715	Evaluation's auc: 0.836612
[355]	Train's auc: 0.86275	Evaluation's auc: 0.836651
[356]	Train's auc: 0.862806	Evaluation's auc: 0.836678
[357]	Train's auc: 0.862861	Evaluation's auc: 0.836681
[358]	Train's auc: 0.862909	Evaluation's auc: 0.836699
[359]	Train's auc: 0.862948	Evaluation's auc: 0.836687
... (per-round output for iterations 360-630 omitted; train AUC climbs steadily to ~0.8756 while evaluation AUC plateaus near 0.8407) ...
[631]	Train's auc: 0.875644	Evaluation's auc: 0.8407
Early stopping, best iteration is:
[611]	Train's auc: 0.874951	Evaluation's auc: 0.840729
In [60]:
y_pred = gbm_rh.predict(x_val)
y_pred = np.where(y_pred > 0.5, 1, 0) # convert into binary values
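The fixed 0.5 cutoff is a convenient default, but with an imbalanced outcome (roughly 10,017 negatives vs. 2,783 positives in this validation split, per the confusion matrix below) a tuned cutoff often trades a little accuracy for much better sensitivity. A minimal sketch, assuming `y_score` holds predicted probabilities, that scans candidate thresholds for the best Youden's J statistic (sensitivity + specificity - 1):

```python
import numpy as np

def best_threshold(y_true, y_score):
    # Scan every observed score as a candidate cutoff and keep the one
    # maximizing Youden's J = sensitivity + specificity - 1.
    pos = y_true == 1
    neg = ~pos
    best_t, best_j = 0.5, -1.0
    for t in np.unique(y_score):
        pred = y_score >= t
        tpr = (pred & pos).sum() / max(pos.sum(), 1)   # sensitivity
        tnr = (~pred & neg).sum() / max(neg.sum(), 1)  # specificity
        j = tpr + tnr - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t
```

With `gbm_rh`'s validation scores this would replace the hard-coded `0.5` in the `np.where` call above.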
In [61]:
# Accuracy on the validation set
from sklearn.metrics import accuracy_score
accuracy = accuracy_score(y_val, y_pred)  # sklearn convention: (y_true, y_pred)
accuracy
Out[61]:
0.824765625
In [62]:
# Confusion matrix
from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_val, y_pred)
cm
Out[62]:
array([[9410,  607],
       [1636, 1147]])
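The raw counts are more informative than accuracy alone here: of the 2,783 true positives in the validation set, only 1,147 are recovered. A short sketch deriving sensitivity, specificity, and precision from a 2x2 matrix in sklearn's layout (rows = true class, columns = predicted class):

```python
import numpy as np

def binary_cm_metrics(cm):
    # sklearn's confusion_matrix layout: [[tn, fp], [fn, tp]]
    tn, fp, fn, tp = cm.ravel()
    return {
        'sensitivity': tp / (tp + fn),  # recall on the positive class
        'specificity': tn / (tn + fp),
        'precision':   tp / (tp + fp),
    }

metrics = binary_cm_metrics(np.array([[9410, 607], [1636, 1147]]))
# Sensitivity is ~0.41 despite ~0.82 accuracy, reflecting the class imbalance.
```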
In [63]:
# Concatenate x_train and x_val (and their labels) so the final model can be retrained on all labeled data
dfs = [x_train, x_val]
dfsy = [y_train, y_val]
x_train_rf = pd.concat(dfs)
y_train_rf = pd.concat(dfsy)
In [64]:
lightgbm_hp_train = lgb.Dataset(x_train_rf, y_train_rf)
In [65]:
# Override the previous Dataset: use x_train only, so x_val stays held out during tuning
lightgbm_hp_train = lgb.Dataset(x_train, y_train)
In [66]:
lightgbm_hp_val = lgb.Dataset(x_val, y_val)
In [67]:
# Define the search space and the prior distributions Hyperopt samples from
lgbm_space = {
# hp.choice picks one value from the given list; 'dart', 'goss', or 'rf' could be added as candidates
    'boosting_type': hp.hp.choice('boosting_type',  ['gbdt']),
    'num_leaves': hp.hp.choice('num_leaves', np.arange(10, 300, 1, dtype=int)),
    'subsample':hp.hp.quniform('subsample',0.5,1.0,0.05),
    'colsample_bytree':hp.hp.quniform('colsample_bytree',0.5,1.0,0.05),
    'min_child_weight':hp.hp.quniform('min_child_weight', 100, 1000,100),
    'reg_alpha': hp.hp.uniform('reg_alpha', 0.0, 1000.0),
    'reg_lambda': hp.hp.uniform('reg_lambda', 0.0, 1000.0),
    'learning_rate': hp.hp.loguniform('learning_rate', -4, 0),
    'feature_fraction': hp.hp.loguniform('feature_fraction', -4, 0),
    'bagging_fraction': hp.hp.loguniform('bagging_fraction', -4, 0),
    'bagging_frequency': hp.hp.choice('bagging_frequency', np.arange(5, 100, 1, dtype=int)),  # note: LightGBM's recognized name is 'bagging_freq'
    'drop_rate': hp.hp.loguniform('drop_rate', -4, 0),
    'scale_pos_weight': hp.hp.uniform('scale_pos_weight', 6.0, 10.0),
    'metric' : 'auc',
    'nthread': 6, 
    'max_bin': 512
    }
In [68]:
# Define the objective (loss) function that Hyperopt minimizes
def objective_m(params, n_folds=5):

    model = lgb.cv(params = params,
                   train_set = lightgbm_hp_train,
                   num_boost_round = 10000,
                   early_stopping_rounds = 10,
                   nfold = n_folds)

    # Return 1 minus the best mean cross-validated AUC,
    # so minimizing this loss maximizes AUC
    loss = 1 - (max(model['auc-mean']))
    return loss


bayes_trials = Trials()
MAX_EVALS = 10 # this controls the runtime 

lgbm_best_m = fmin(fn = objective_m, space = lgbm_space, algo = hp.tpe.suggest,
                   max_evals = MAX_EVALS, trials = bayes_trials)
100%|██████████| 10/10 [00:13<00:00,  1.30s/trial, best loss: 0.17421290189754757]
In [69]:
# LightGBM best parameters
lgbm_best_m
Out[69]:
{'bagging_fraction': 0.21835717243856412,
 'bagging_frequency': 25,
 'boosting_type': 0,
 'colsample_bytree': 0.9,
 'drop_rate': 0.09585306765988078,
 'feature_fraction': 0.39962859971210074,
 'learning_rate': 0.29662173472784276,
 'min_child_weight': 800.0,
 'num_leaves': 114,
 'reg_alpha': 108.33230702491204,
 'reg_lambda': 756.4823876094997,
 'scale_pos_weight': 8.768176035187624,
 'subsample': 0.8}
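Note that `fmin` reports `hp.choice` parameters as indices into their candidate lists, not as the chosen values: `boosting_type: 0` means `'gbdt'`, and `num_leaves: 114` is the 114th entry of `np.arange(10, 300)`, i.e. 124. `hyperopt.space_eval(lgbm_space, lgbm_best_m)` resolves these automatically; a manual sketch of the same mapping (candidate lists copied from the space above):

```python
import numpy as np

# Candidate lists for the hp.choice parameters in lgbm_space
choice_lists = {
    'boosting_type': ['gbdt'],
    'num_leaves': np.arange(10, 300, 1, dtype=int),
    'bagging_frequency': np.arange(5, 100, 1, dtype=int),
}

def resolve_choices(best, choice_lists):
    # Map hp.choice indices back to actual values; pass other params through
    return {k: (choice_lists[k][v] if k in choice_lists else v)
            for k, v in best.items()}

resolved = resolve_choices({'boosting_type': 0, 'num_leaves': 114,
                            'bagging_frequency': 25}, choice_lists)
# resolved: {'boosting_type': 'gbdt', 'num_leaves': 124, 'bagging_frequency': 30}
```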
In [70]:
# Final LightGBM parameters for retraining
lgb_params = {
 'objective': 'binary',
 'metric': 'auc',   
 'bagging_fraction': 0.13,
 'bagging_frequency': 35,
 'boosting_type': 'gbdt',
 'colsample_bytree': 0.7,
 'drop_rate': 0.49,
 'feature_fraction': 0.77,
 'learning_rate': 0.04,
 'min_child_weight': 700.0,
 'num_leaves': 201,
 'reg_alpha': 73,
 'reg_lambda': 958,
 'scale_pos_weight': 6,
 'subsample': 0.65,
 'early_stopping_round' : 20,
}
In [71]:
lgb_proc_train_rh = lgb.Dataset(x_train, y_train)
lgb_proc_val_rh = lgb.Dataset(x_val, y_val)
In [72]:
gbm_rh = lgb.train(params = lgb_params, train_set = lgb_proc_train_rh,
                num_boost_round = 5000, valid_sets = [lgb_proc_val_rh, lgb_proc_train_rh],
               valid_names = ['Evaluation', 'Train'])
/Users/piumallick/anaconda3/lib/python3.7/site-packages/lightgbm/engine.py:153: UserWarning: Found `early_stopping_round` in params. Will use it instead of argument
  warnings.warn("Found `{}` in params. Will use it instead of argument".format(alias))
[1]	Train's auc: 0.682345	Evaluation's auc: 0.680424
Training until validation scores don't improve for 20 rounds
[2]	Train's auc: 0.684777	Evaluation's auc: 0.682685
[3]	Train's auc: 0.685064	Evaluation's auc: 0.682793
[4]	Train's auc: 0.704032	Evaluation's auc: 0.703819
[5]	Train's auc: 0.707238	Evaluation's auc: 0.70653
... (per-round output for iterations 6-310 omitted; both AUCs rise steadily, evaluation AUC from ~0.71 to ~0.8246) ...
[311]	Train's auc: 0.842532	Evaluation's auc: 0.824664
[312]	Train's auc: 0.842577	Evaluation's auc: 0.824683
[313]	Train's auc: 0.842617	Evaluation's auc: 0.824682
[314]	Train's auc: 0.842665	Evaluation's auc: 0.824723
[315]	Train's auc: 0.842696	Evaluation's auc: 0.824735
[316]	Train's auc: 0.842783	Evaluation's auc: 0.824806
[317]	Train's auc: 0.842845	Evaluation's auc: 0.824854
[318]	Train's auc: 0.8429	Evaluation's auc: 0.824894
[319]	Train's auc: 0.842979	Evaluation's auc: 0.824941
[320]	Train's auc: 0.843008	Evaluation's auc: 0.82496
[321]	Train's auc: 0.843053	Evaluation's auc: 0.824987
[322]	Train's auc: 0.843093	Evaluation's auc: 0.825016
[323]	Train's auc: 0.843142	Evaluation's auc: 0.825061
[324]	Train's auc: 0.84322	Evaluation's auc: 0.825112
[325]	Train's auc: 0.843269	Evaluation's auc: 0.825135
[326]	Train's auc: 0.843315	Evaluation's auc: 0.825161
[327]	Train's auc: 0.843371	Evaluation's auc: 0.825187
[328]	Train's auc: 0.843438	Evaluation's auc: 0.825263
[329]	Train's auc: 0.843473	Evaluation's auc: 0.825309
[330]	Train's auc: 0.843516	Evaluation's auc: 0.825347
[331]	Train's auc: 0.843556	Evaluation's auc: 0.825375
[332]	Train's auc: 0.843618	Evaluation's auc: 0.825403
[333]	Train's auc: 0.843695	Evaluation's auc: 0.825454
[334]	Train's auc: 0.843737	Evaluation's auc: 0.825476
[335]	Train's auc: 0.843776	Evaluation's auc: 0.825501
[336]	Train's auc: 0.843819	Evaluation's auc: 0.825514
[337]	Train's auc: 0.843861	Evaluation's auc: 0.825529
[338]	Train's auc: 0.843888	Evaluation's auc: 0.825541
[339]	Train's auc: 0.843924	Evaluation's auc: 0.825573
[340]	Train's auc: 0.843952	Evaluation's auc: 0.825581
[341]	Train's auc: 0.844016	Evaluation's auc: 0.825612
[342]	Train's auc: 0.844051	Evaluation's auc: 0.825645
[343]	Train's auc: 0.844095	Evaluation's auc: 0.825667
[344]	Train's auc: 0.84412	Evaluation's auc: 0.825687
[345]	Train's auc: 0.844164	Evaluation's auc: 0.825719
[346]	Train's auc: 0.844234	Evaluation's auc: 0.825792
[347]	Train's auc: 0.844297	Evaluation's auc: 0.825858
[348]	Train's auc: 0.844359	Evaluation's auc: 0.825895
[349]	Train's auc: 0.844405	Evaluation's auc: 0.825935
[350]	Train's auc: 0.844444	Evaluation's auc: 0.825973
[351]	Train's auc: 0.844488	Evaluation's auc: 0.825976
[352]	Train's auc: 0.844515	Evaluation's auc: 0.825975
[353]	Train's auc: 0.844552	Evaluation's auc: 0.826
[354]	Train's auc: 0.844598	Evaluation's auc: 0.82601
[355]	Train's auc: 0.844633	Evaluation's auc: 0.826045
[356]	Train's auc: 0.84467	Evaluation's auc: 0.826077
[357]	Train's auc: 0.844705	Evaluation's auc: 0.826096
[358]	Train's auc: 0.844764	Evaluation's auc: 0.826126
[359]	Train's auc: 0.844824	Evaluation's auc: 0.826182
[360]	Train's auc: 0.844849	Evaluation's auc: 0.826199
[361]	Train's auc: 0.844884	Evaluation's auc: 0.826194
[362]	Train's auc: 0.844932	Evaluation's auc: 0.826238
[363]	Train's auc: 0.844963	Evaluation's auc: 0.826247
[364]	Train's auc: 0.845015	Evaluation's auc: 0.826277
[365]	Train's auc: 0.845078	Evaluation's auc: 0.826343
[366]	Train's auc: 0.845135	Evaluation's auc: 0.826373
[367]	Train's auc: 0.845165	Evaluation's auc: 0.826385
[368]	Train's auc: 0.845206	Evaluation's auc: 0.826393
[369]	Train's auc: 0.845234	Evaluation's auc: 0.826406
[370]	Train's auc: 0.845273	Evaluation's auc: 0.826444
[371]	Train's auc: 0.845312	Evaluation's auc: 0.826468
[372]	Train's auc: 0.845377	Evaluation's auc: 0.826541
[373]	Train's auc: 0.845423	Evaluation's auc: 0.826588
[374]	Train's auc: 0.845457	Evaluation's auc: 0.826624
[375]	Train's auc: 0.845492	Evaluation's auc: 0.826628
[376]	Train's auc: 0.845526	Evaluation's auc: 0.826643
[377]	Train's auc: 0.845562	Evaluation's auc: 0.826638
[378]	Train's auc: 0.845593	Evaluation's auc: 0.826653
[379]	Train's auc: 0.845644	Evaluation's auc: 0.826692
[380]	Train's auc: 0.845688	Evaluation's auc: 0.826738
[381]	Train's auc: 0.845736	Evaluation's auc: 0.826765
[382]	Train's auc: 0.845772	Evaluation's auc: 0.826776
[383]	Train's auc: 0.845822	Evaluation's auc: 0.82677
[384]	Train's auc: 0.845852	Evaluation's auc: 0.826772
[385]	Train's auc: 0.845879	Evaluation's auc: 0.826786
[386]	Train's auc: 0.845903	Evaluation's auc: 0.82681
[387]	Train's auc: 0.845942	Evaluation's auc: 0.826845
[388]	Train's auc: 0.845994	Evaluation's auc: 0.8269
[389]	Train's auc: 0.846031	Evaluation's auc: 0.826905
[390]	Train's auc: 0.846068	Evaluation's auc: 0.826926
[391]	Train's auc: 0.846124	Evaluation's auc: 0.82699
[392]	Train's auc: 0.84616	Evaluation's auc: 0.827006
[393]	Train's auc: 0.846193	Evaluation's auc: 0.827026
[394]	Train's auc: 0.846218	Evaluation's auc: 0.827044
[395]	Train's auc: 0.846253	Evaluation's auc: 0.827069
[396]	Train's auc: 0.846266	Evaluation's auc: 0.827066
[397]	Train's auc: 0.846322	Evaluation's auc: 0.827098
[398]	Train's auc: 0.846359	Evaluation's auc: 0.827118
[399]	Train's auc: 0.846388	Evaluation's auc: 0.827122
[400]	Train's auc: 0.846425	Evaluation's auc: 0.827162
[401]	Train's auc: 0.846455	Evaluation's auc: 0.827169
[402]	Train's auc: 0.84649	Evaluation's auc: 0.827172
[403]	Train's auc: 0.846536	Evaluation's auc: 0.827201
[404]	Train's auc: 0.846564	Evaluation's auc: 0.827204
[405]	Train's auc: 0.846606	Evaluation's auc: 0.827231
[406]	Train's auc: 0.84664	Evaluation's auc: 0.827246
[407]	Train's auc: 0.846688	Evaluation's auc: 0.82728
[408]	Train's auc: 0.846723	Evaluation's auc: 0.827313
[409]	Train's auc: 0.846751	Evaluation's auc: 0.827327
[410]	Train's auc: 0.846789	Evaluation's auc: 0.827364
[411]	Train's auc: 0.846828	Evaluation's auc: 0.827383
[412]	Train's auc: 0.846872	Evaluation's auc: 0.827407
[413]	Train's auc: 0.846902	Evaluation's auc: 0.827408
[414]	Train's auc: 0.846956	Evaluation's auc: 0.827452
[415]	Train's auc: 0.846998	Evaluation's auc: 0.827461
[416]	Train's auc: 0.847039	Evaluation's auc: 0.82748
[417]	Train's auc: 0.847072	Evaluation's auc: 0.827486
[418]	Train's auc: 0.847107	Evaluation's auc: 0.827499
[419]	Train's auc: 0.847133	Evaluation's auc: 0.82751
[420]	Train's auc: 0.847195	Evaluation's auc: 0.827545
[421]	Train's auc: 0.847218	Evaluation's auc: 0.827564
[422]	Train's auc: 0.847257	Evaluation's auc: 0.827585
[423]	Train's auc: 0.847296	Evaluation's auc: 0.82758
[424]	Train's auc: 0.847324	Evaluation's auc: 0.827611
[425]	Train's auc: 0.847351	Evaluation's auc: 0.827613
[426]	Train's auc: 0.847396	Evaluation's auc: 0.827641
[427]	Train's auc: 0.847437	Evaluation's auc: 0.827672
[428]	Train's auc: 0.847467	Evaluation's auc: 0.827671
[429]	Train's auc: 0.847509	Evaluation's auc: 0.827709
[430]	Train's auc: 0.847557	Evaluation's auc: 0.827735
[431]	Train's auc: 0.847594	Evaluation's auc: 0.827749
[432]	Train's auc: 0.847627	Evaluation's auc: 0.827768
[433]	Train's auc: 0.847662	Evaluation's auc: 0.827762
[434]	Train's auc: 0.84769	Evaluation's auc: 0.827774
[435]	Train's auc: 0.847728	Evaluation's auc: 0.827793
[436]	Train's auc: 0.847765	Evaluation's auc: 0.827828
[437]	Train's auc: 0.847793	Evaluation's auc: 0.827847
[438]	Train's auc: 0.847848	Evaluation's auc: 0.827904
[439]	Train's auc: 0.847897	Evaluation's auc: 0.827934
[440]	Train's auc: 0.847932	Evaluation's auc: 0.827943
[441]	Train's auc: 0.847992	Evaluation's auc: 0.827981
[442]	Train's auc: 0.848017	Evaluation's auc: 0.827985
[443]	Train's auc: 0.848044	Evaluation's auc: 0.827995
[444]	Train's auc: 0.848089	Evaluation's auc: 0.828022
[445]	Train's auc: 0.848119	Evaluation's auc: 0.828039
[446]	Train's auc: 0.848145	Evaluation's auc: 0.828051
[447]	Train's auc: 0.848174	Evaluation's auc: 0.828068
[448]	Train's auc: 0.848207	Evaluation's auc: 0.828097
[449]	Train's auc: 0.848223	Evaluation's auc: 0.828106
[450]	Train's auc: 0.848255	Evaluation's auc: 0.828112
[451]	Train's auc: 0.848281	Evaluation's auc: 0.828134
[452]	Train's auc: 0.848311	Evaluation's auc: 0.828162
[453]	Train's auc: 0.848341	Evaluation's auc: 0.828175
[454]	Train's auc: 0.848369	Evaluation's auc: 0.828181
[455]	Train's auc: 0.848399	Evaluation's auc: 0.828182
[456]	Train's auc: 0.848422	Evaluation's auc: 0.828187
[457]	Train's auc: 0.848453	Evaluation's auc: 0.82819
[458]	Train's auc: 0.848516	Evaluation's auc: 0.828245
[459]	Train's auc: 0.848547	Evaluation's auc: 0.828267
[460]	Train's auc: 0.848579	Evaluation's auc: 0.828271
[461]	Train's auc: 0.848626	Evaluation's auc: 0.828328
[462]	Train's auc: 0.848661	Evaluation's auc: 0.828333
[463]	Train's auc: 0.848683	Evaluation's auc: 0.828336
[464]	Train's auc: 0.848727	Evaluation's auc: 0.828345
[465]	Train's auc: 0.848757	Evaluation's auc: 0.828375
[466]	Train's auc: 0.848793	Evaluation's auc: 0.828393
[467]	Train's auc: 0.848833	Evaluation's auc: 0.828424
[468]	Train's auc: 0.848865	Evaluation's auc: 0.828442
[469]	Train's auc: 0.84889	Evaluation's auc: 0.828438
[470]	Train's auc: 0.848935	Evaluation's auc: 0.82847
[471]	Train's auc: 0.848957	Evaluation's auc: 0.828471
[472]	Train's auc: 0.848987	Evaluation's auc: 0.828504
[473]	Train's auc: 0.849028	Evaluation's auc: 0.828541
[474]	Train's auc: 0.849074	Evaluation's auc: 0.8286
[475]	Train's auc: 0.849115	Evaluation's auc: 0.82863
[476]	Train's auc: 0.849147	Evaluation's auc: 0.828648
[477]	Train's auc: 0.849178	Evaluation's auc: 0.828659
[478]	Train's auc: 0.849211	Evaluation's auc: 0.828677
[479]	Train's auc: 0.849237	Evaluation's auc: 0.828682
[480]	Train's auc: 0.849269	Evaluation's auc: 0.828689
[481]	Train's auc: 0.8493	Evaluation's auc: 0.828727
[482]	Train's auc: 0.849334	Evaluation's auc: 0.828743
[483]	Train's auc: 0.84935	Evaluation's auc: 0.828737
[484]	Train's auc: 0.849396	Evaluation's auc: 0.82877
[485]	Train's auc: 0.849437	Evaluation's auc: 0.828804
[486]	Train's auc: 0.849464	Evaluation's auc: 0.828821
[487]	Train's auc: 0.849504	Evaluation's auc: 0.828844
[488]	Train's auc: 0.849534	Evaluation's auc: 0.828865
[489]	Train's auc: 0.849568	Evaluation's auc: 0.828899
[490]	Train's auc: 0.849589	Evaluation's auc: 0.82891
[491]	Train's auc: 0.849627	Evaluation's auc: 0.828943
[492]	Train's auc: 0.849647	Evaluation's auc: 0.828951
[493]	Train's auc: 0.849708	Evaluation's auc: 0.829001
[494]	Train's auc: 0.849762	Evaluation's auc: 0.829049
[495]	Train's auc: 0.849785	Evaluation's auc: 0.829059
[496]	Train's auc: 0.849839	Evaluation's auc: 0.829109
[497]	Train's auc: 0.849878	Evaluation's auc: 0.829113
[498]	Train's auc: 0.849912	Evaluation's auc: 0.829131
[499]	Train's auc: 0.849952	Evaluation's auc: 0.829137
[500]	Train's auc: 0.84999	Evaluation's auc: 0.829144
[501]	Train's auc: 0.85002	Evaluation's auc: 0.829148
[502]	Train's auc: 0.85006	Evaluation's auc: 0.829177
[503]	Train's auc: 0.850094	Evaluation's auc: 0.82918
[504]	Train's auc: 0.85013	Evaluation's auc: 0.829196
[505]	Train's auc: 0.850171	Evaluation's auc: 0.829201
[506]	Train's auc: 0.850204	Evaluation's auc: 0.82923
[507]	Train's auc: 0.85024	Evaluation's auc: 0.829255
[508]	Train's auc: 0.850274	Evaluation's auc: 0.829248
[509]	Train's auc: 0.850306	Evaluation's auc: 0.829278
[510]	Train's auc: 0.85034	Evaluation's auc: 0.829292
[511]	Train's auc: 0.850385	Evaluation's auc: 0.829324
[512]	Train's auc: 0.850405	Evaluation's auc: 0.829325
[513]	Train's auc: 0.850443	Evaluation's auc: 0.829343
[514]	Train's auc: 0.850489	Evaluation's auc: 0.829385
[515]	Train's auc: 0.850525	Evaluation's auc: 0.829389
[516]	Train's auc: 0.850573	Evaluation's auc: 0.829425
[517]	Train's auc: 0.850602	Evaluation's auc: 0.82944
[518]	Train's auc: 0.850634	Evaluation's auc: 0.829465
[519]	Train's auc: 0.850675	Evaluation's auc: 0.829487
[520]	Train's auc: 0.850698	Evaluation's auc: 0.82951
[521]	Train's auc: 0.850729	Evaluation's auc: 0.829516
[522]	Train's auc: 0.850778	Evaluation's auc: 0.829539
[523]	Train's auc: 0.85081	Evaluation's auc: 0.829541
[524]	Train's auc: 0.850849	Evaluation's auc: 0.829569
[525]	Train's auc: 0.850881	Evaluation's auc: 0.829582
[526]	Train's auc: 0.850904	Evaluation's auc: 0.829602
[527]	Train's auc: 0.850939	Evaluation's auc: 0.82961
[528]	Train's auc: 0.850965	Evaluation's auc: 0.829613
[529]	Train's auc: 0.851014	Evaluation's auc: 0.829654
[530]	Train's auc: 0.851051	Evaluation's auc: 0.829668
[531]	Train's auc: 0.851069	Evaluation's auc: 0.82967
[532]	Train's auc: 0.851104	Evaluation's auc: 0.82967
[533]	Train's auc: 0.851133	Evaluation's auc: 0.829693
[534]	Train's auc: 0.851149	Evaluation's auc: 0.829687
[535]	Train's auc: 0.85119	Evaluation's auc: 0.829728
[536]	Train's auc: 0.851221	Evaluation's auc: 0.829737
[537]	Train's auc: 0.851256	Evaluation's auc: 0.829763
[538]	Train's auc: 0.851278	Evaluation's auc: 0.829765
[539]	Train's auc: 0.851309	Evaluation's auc: 0.829779
[540]	Train's auc: 0.851351	Evaluation's auc: 0.829815
[541]	Train's auc: 0.851402	Evaluation's auc: 0.829854
[542]	Train's auc: 0.851426	Evaluation's auc: 0.829854
[543]	Train's auc: 0.851452	Evaluation's auc: 0.829856
[544]	Train's auc: 0.851489	Evaluation's auc: 0.829887
[545]	Train's auc: 0.851526	Evaluation's auc: 0.829922
[546]	Train's auc: 0.851563	Evaluation's auc: 0.829952
[547]	Train's auc: 0.851617	Evaluation's auc: 0.830004
[548]	Train's auc: 0.851643	Evaluation's auc: 0.83
[549]	Train's auc: 0.85166	Evaluation's auc: 0.829998
[550]	Train's auc: 0.851686	Evaluation's auc: 0.830001
[551]	Train's auc: 0.851727	Evaluation's auc: 0.830029
[552]	Train's auc: 0.851765	Evaluation's auc: 0.830061
[553]	Train's auc: 0.851807	Evaluation's auc: 0.830095
[554]	Train's auc: 0.851839	Evaluation's auc: 0.830102
[555]	Train's auc: 0.851869	Evaluation's auc: 0.83011
[556]	Train's auc: 0.85191	Evaluation's auc: 0.830141
[557]	Train's auc: 0.851938	Evaluation's auc: 0.83014
[558]	Train's auc: 0.851978	Evaluation's auc: 0.830153
[559]	Train's auc: 0.852013	Evaluation's auc: 0.830165
[560]	Train's auc: 0.852033	Evaluation's auc: 0.830181
[561]	Train's auc: 0.852088	Evaluation's auc: 0.830233
[562]	Train's auc: 0.852113	Evaluation's auc: 0.830239
[563]	Train's auc: 0.852136	Evaluation's auc: 0.830243
[564]	Train's auc: 0.852155	Evaluation's auc: 0.830259
[565]	Train's auc: 0.852185	Evaluation's auc: 0.830277
[566]	Train's auc: 0.852213	Evaluation's auc: 0.830293
[567]	Train's auc: 0.852239	Evaluation's auc: 0.830302
[568]	Train's auc: 0.852267	Evaluation's auc: 0.830329
[569]	Train's auc: 0.852307	Evaluation's auc: 0.830341
[570]	Train's auc: 0.852336	Evaluation's auc: 0.830349
[571]	Train's auc: 0.852373	Evaluation's auc: 0.830364
[572]	Train's auc: 0.852392	Evaluation's auc: 0.830374
[573]	Train's auc: 0.852418	Evaluation's auc: 0.830384
[574]	Train's auc: 0.852463	Evaluation's auc: 0.830395
[575]	Train's auc: 0.852482	Evaluation's auc: 0.830403
[576]	Train's auc: 0.852528	Evaluation's auc: 0.830418
[577]	Train's auc: 0.852559	Evaluation's auc: 0.830424
[578]	Train's auc: 0.852584	Evaluation's auc: 0.830417
[579]	Train's auc: 0.852616	Evaluation's auc: 0.830438
[580]	Train's auc: 0.852638	Evaluation's auc: 0.83045
[581]	Train's auc: 0.852671	Evaluation's auc: 0.830481
[582]	Train's auc: 0.85269	Evaluation's auc: 0.830489
[583]	Train's auc: 0.852713	Evaluation's auc: 0.830514
[584]	Train's auc: 0.852744	Evaluation's auc: 0.830529
[585]	Train's auc: 0.852784	Evaluation's auc: 0.830576
[586]	Train's auc: 0.85281	Evaluation's auc: 0.830591
[587]	Train's auc: 0.852837	Evaluation's auc: 0.830595
[588]	Train's auc: 0.852869	Evaluation's auc: 0.83061
[589]	Train's auc: 0.852906	Evaluation's auc: 0.830644
[590]	Train's auc: 0.85294	Evaluation's auc: 0.830636
[591]	Train's auc: 0.852958	Evaluation's auc: 0.830622
[592]	Train's auc: 0.853001	Evaluation's auc: 0.830657
[593]	Train's auc: 0.853022	Evaluation's auc: 0.830641
[594]	Train's auc: 0.853044	Evaluation's auc: 0.830635
[595]	Train's auc: 0.853075	Evaluation's auc: 0.830653
[596]	Train's auc: 0.853096	Evaluation's auc: 0.830665
[597]	Train's auc: 0.853118	Evaluation's auc: 0.830666
[598]	Train's auc: 0.853142	Evaluation's auc: 0.830666
[599]	Train's auc: 0.853167	Evaluation's auc: 0.830674
[600]	Train's auc: 0.853195	Evaluation's auc: 0.830689
[601]	Train's auc: 0.853233	Evaluation's auc: 0.830708
[602]	Train's auc: 0.853265	Evaluation's auc: 0.830711
[603]	Train's auc: 0.853299	Evaluation's auc: 0.830746
[604]	Train's auc: 0.85333	Evaluation's auc: 0.830749
[605]	Train's auc: 0.853367	Evaluation's auc: 0.830772
[606]	Train's auc: 0.853421	Evaluation's auc: 0.83081
[607]	Train's auc: 0.853447	Evaluation's auc: 0.830822
[608]	Train's auc: 0.853475	Evaluation's auc: 0.830828
[609]	Train's auc: 0.853515	Evaluation's auc: 0.830837
[610]	Train's auc: 0.853548	Evaluation's auc: 0.830849
[611]	Train's auc: 0.853576	Evaluation's auc: 0.830872
[612]	Train's auc: 0.853609	Evaluation's auc: 0.830903
[613]	Train's auc: 0.853631	Evaluation's auc: 0.830906
[614]	Train's auc: 0.853657	Evaluation's auc: 0.830895
[615]	Train's auc: 0.853683	Evaluation's auc: 0.830904
[616]	Train's auc: 0.853724	Evaluation's auc: 0.830954
[617]	Train's auc: 0.853736	Evaluation's auc: 0.830955
[618]	Train's auc: 0.853758	Evaluation's auc: 0.830949
[619]	Train's auc: 0.853786	Evaluation's auc: 0.830956
[620]	Train's auc: 0.853816	Evaluation's auc: 0.830955
[621]	Train's auc: 0.853855	Evaluation's auc: 0.831002
[622]	Train's auc: 0.853874	Evaluation's auc: 0.831011
[623]	Train's auc: 0.853906	Evaluation's auc: 0.831023
[624]	Train's auc: 0.85394	Evaluation's auc: 0.831028
[625]	Train's auc: 0.853966	Evaluation's auc: 0.831023
[626]	Train's auc: 0.854002	Evaluation's auc: 0.831047
[627]	Train's auc: 0.854028	Evaluation's auc: 0.831053
[628]	Train's auc: 0.854064	Evaluation's auc: 0.831071
[629]	Train's auc: 0.854097	Evaluation's auc: 0.831121
[630]	Train's auc: 0.854131	Evaluation's auc: 0.831128
[631]	Train's auc: 0.854149	Evaluation's auc: 0.831125
[632]	Train's auc: 0.854187	Evaluation's auc: 0.831153
[633]	Train's auc: 0.854231	Evaluation's auc: 0.831189
[634]	Train's auc: 0.854272	Evaluation's auc: 0.831237
[635]	Train's auc: 0.854299	Evaluation's auc: 0.831236
[636]	Train's auc: 0.854319	Evaluation's auc: 0.831235
[637]	Train's auc: 0.854351	Evaluation's auc: 0.831244
[638]	Train's auc: 0.854382	Evaluation's auc: 0.831273
[639]	Train's auc: 0.854414	Evaluation's auc: 0.831297
[640]	Train's auc: 0.854444	Evaluation's auc: 0.831315
[641]	Train's auc: 0.854467	Evaluation's auc: 0.831322
[642]	Train's auc: 0.854505	Evaluation's auc: 0.831355
[643]	Train's auc: 0.854523	Evaluation's auc: 0.83136
[644]	Train's auc: 0.854552	Evaluation's auc: 0.831377
[645]	Train's auc: 0.85458	Evaluation's auc: 0.831383
[646]	Train's auc: 0.854606	Evaluation's auc: 0.831379
[647]	Train's auc: 0.854634	Evaluation's auc: 0.831421
[648]	Train's auc: 0.854673	Evaluation's auc: 0.831439
[649]	Train's auc: 0.854697	Evaluation's auc: 0.831463
[650]	Train's auc: 0.854729	Evaluation's auc: 0.831478
[651]	Train's auc: 0.854751	Evaluation's auc: 0.831484
[652]	Train's auc: 0.854793	Evaluation's auc: 0.831524
[653]	Train's auc: 0.85482	Evaluation's auc: 0.831541
[654]	Train's auc: 0.854849	Evaluation's auc: 0.831547
[655]	Train's auc: 0.854882	Evaluation's auc: 0.831532
[656]	Train's auc: 0.854915	Evaluation's auc: 0.831537
[657]	Train's auc: 0.854948	Evaluation's auc: 0.831543
[658]	Train's auc: 0.854971	Evaluation's auc: 0.831543
[659]	Train's auc: 0.855012	Evaluation's auc: 0.831578
[660]	Train's auc: 0.855043	Evaluation's auc: 0.831609
[661]	Train's auc: 0.855065	Evaluation's auc: 0.831605
[662]	Train's auc: 0.855092	Evaluation's auc: 0.831609
[663]	Train's auc: 0.855116	Evaluation's auc: 0.831616
[664]	Train's auc: 0.855136	Evaluation's auc: 0.831618
[665]	Train's auc: 0.855152	Evaluation's auc: 0.831612
[666]	Train's auc: 0.855187	Evaluation's auc: 0.831645
[667]	Train's auc: 0.855218	Evaluation's auc: 0.831648
[668]	Train's auc: 0.855237	Evaluation's auc: 0.83165
[669]	Train's auc: 0.855259	Evaluation's auc: 0.831674
[670]	Train's auc: 0.855284	Evaluation's auc: 0.831682
[671]	Train's auc: 0.85531	Evaluation's auc: 0.831696
[672]	Train's auc: 0.855345	Evaluation's auc: 0.831709
[673]	Train's auc: 0.855368	Evaluation's auc: 0.831724
[674]	Train's auc: 0.8554	Evaluation's auc: 0.831764
[675]	Train's auc: 0.855435	Evaluation's auc: 0.831799
[676]	Train's auc: 0.85546	Evaluation's auc: 0.831815
[677]	Train's auc: 0.855481	Evaluation's auc: 0.831803
[678]	Train's auc: 0.855505	Evaluation's auc: 0.831803
[679]	Train's auc: 0.855523	Evaluation's auc: 0.831798
[680]	Train's auc: 0.855553	Evaluation's auc: 0.831815
[681]	Train's auc: 0.855588	Evaluation's auc: 0.831832
[682]	Train's auc: 0.855621	Evaluation's auc: 0.83186
[683]	Train's auc: 0.855641	Evaluation's auc: 0.831869
[684]	Train's auc: 0.85567	Evaluation's auc: 0.831904
[685]	Train's auc: 0.855699	Evaluation's auc: 0.83192
[686]	Train's auc: 0.855722	Evaluation's auc: 0.831941
[687]	Train's auc: 0.85574	Evaluation's auc: 0.831942
[688]	Train's auc: 0.855774	Evaluation's auc: 0.831971
[689]	Train's auc: 0.855788	Evaluation's auc: 0.831981
[690]	Train's auc: 0.855814	Evaluation's auc: 0.831991
[691]	Train's auc: 0.855837	Evaluation's auc: 0.832016
[692]	Train's auc: 0.85586	Evaluation's auc: 0.832017
[693]	Train's auc: 0.855885	Evaluation's auc: 0.83204
[694]	Train's auc: 0.855903	Evaluation's auc: 0.832047
[695]	Train's auc: 0.855934	Evaluation's auc: 0.832041
[696]	Train's auc: 0.855953	Evaluation's auc: 0.832049
[697]	Train's auc: 0.855978	Evaluation's auc: 0.832065
[698]	Train's auc: 0.855996	Evaluation's auc: 0.832061
[699]	Train's auc: 0.856032	Evaluation's auc: 0.832085
[700]	Train's auc: 0.856049	Evaluation's auc: 0.832095
[701]	Train's auc: 0.85608	Evaluation's auc: 0.832106
[702]	Train's auc: 0.856105	Evaluation's auc: 0.832105
[703]	Train's auc: 0.856126	Evaluation's auc: 0.832101
[704]	Train's auc: 0.856153	Evaluation's auc: 0.832109
[705]	Train's auc: 0.856179	Evaluation's auc: 0.832144
[706]	Train's auc: 0.856215	Evaluation's auc: 0.832171
[707]	Train's auc: 0.856231	Evaluation's auc: 0.832181
[708]	Train's auc: 0.856247	Evaluation's auc: 0.832179
[709]	Train's auc: 0.856277	Evaluation's auc: 0.83219
[710]	Train's auc: 0.856305	Evaluation's auc: 0.832208
[711]	Train's auc: 0.85634	Evaluation's auc: 0.832236
[712]	Train's auc: 0.856368	Evaluation's auc: 0.832229
[713]	Train's auc: 0.856398	Evaluation's auc: 0.83223
[714]	Train's auc: 0.856418	Evaluation's auc: 0.832261
[715]	Train's auc: 0.85644	Evaluation's auc: 0.832278
[716]	Train's auc: 0.856474	Evaluation's auc: 0.832292
[717]	Train's auc: 0.856494	Evaluation's auc: 0.832282
[718]	Train's auc: 0.856526	Evaluation's auc: 0.83228
[719]	Train's auc: 0.856551	Evaluation's auc: 0.832281
[720]	Train's auc: 0.856574	Evaluation's auc: 0.832284
[721]	Train's auc: 0.85659	Evaluation's auc: 0.832281
[722]	Train's auc: 0.856622	Evaluation's auc: 0.832304
[723]	Train's auc: 0.856636	Evaluation's auc: 0.8323
[724]	Train's auc: 0.856651	Evaluation's auc: 0.832302
[725]	Train's auc: 0.856679	Evaluation's auc: 0.832315
[726]	Train's auc: 0.856694	Evaluation's auc: 0.832323
[727]	Train's auc: 0.85673	Evaluation's auc: 0.832365
[728]	Train's auc: 0.856755	Evaluation's auc: 0.832365
[729]	Train's auc: 0.856772	Evaluation's auc: 0.832349
[730]	Train's auc: 0.856797	Evaluation's auc: 0.832357
[731]	Train's auc: 0.856829	Evaluation's auc: 0.832394
[732]	Train's auc: 0.856846	Evaluation's auc: 0.832404
[733]	Train's auc: 0.856871	Evaluation's auc: 0.832425
[734]	Train's auc: 0.856897	Evaluation's auc: 0.832419
[735]	Train's auc: 0.856922	Evaluation's auc: 0.832431
[736]	Train's auc: 0.856951	Evaluation's auc: 0.832457
[737]	Train's auc: 0.856974	Evaluation's auc: 0.832469
[738]	Train's auc: 0.857003	Evaluation's auc: 0.832496
[739]	Train's auc: 0.857031	Evaluation's auc: 0.832518
[740]	Train's auc: 0.857059	Evaluation's auc: 0.832523
[741]	Train's auc: 0.857097	Evaluation's auc: 0.832558
[742]	Train's auc: 0.857124	Evaluation's auc: 0.832564
[743]	Train's auc: 0.857145	Evaluation's auc: 0.832581
[744]	Train's auc: 0.857169	Evaluation's auc: 0.832576
[745]	Train's auc: 0.857182	Evaluation's auc: 0.832579
[746]	Train's auc: 0.85722	Evaluation's auc: 0.832577
[747]	Train's auc: 0.857238	Evaluation's auc: 0.832569
[748]	Train's auc: 0.857262	Evaluation's auc: 0.832582
[749]	Train's auc: 0.857279	Evaluation's auc: 0.832595
[750]	Train's auc: 0.857303	Evaluation's auc: 0.832618
[751]	Train's auc: 0.857325	Evaluation's auc: 0.832623
[752]	Train's auc: 0.85735	Evaluation's auc: 0.832657
[753]	Train's auc: 0.85737	Evaluation's auc: 0.832667
[754]	Train's auc: 0.857392	Evaluation's auc: 0.832683
[755]	Train's auc: 0.857409	Evaluation's auc: 0.832677
[756]	Train's auc: 0.857437	Evaluation's auc: 0.832681
[757]	Train's auc: 0.857458	Evaluation's auc: 0.832684
[758]	Train's auc: 0.85749	Evaluation's auc: 0.83267
[759]	Train's auc: 0.857503	Evaluation's auc: 0.832683
[760]	Train's auc: 0.857516	Evaluation's auc: 0.83267
[761]	Train's auc: 0.857543	Evaluation's auc: 0.832674
[762]	Train's auc: 0.857549	Evaluation's auc: 0.832676
[763]	Train's auc: 0.857577	Evaluation's auc: 0.832706
[764]	Train's auc: 0.857605	Evaluation's auc: 0.832731
[765]	Train's auc: 0.857622	Evaluation's auc: 0.832746
[766]	Train's auc: 0.857651	Evaluation's auc: 0.832759
[767]	Train's auc: 0.85767	Evaluation's auc: 0.832759
[768]	Train's auc: 0.857691	Evaluation's auc: 0.832764
[769]	Train's auc: 0.857714	Evaluation's auc: 0.832771
[770]	Train's auc: 0.85773	Evaluation's auc: 0.832765
[771]	Train's auc: 0.85775	Evaluation's auc: 0.832771
[772]	Train's auc: 0.857769	Evaluation's auc: 0.832768
[773]	Train's auc: 0.85778	Evaluation's auc: 0.832764
[774]	Train's auc: 0.857811	Evaluation's auc: 0.83279
[775]	Train's auc: 0.857834	Evaluation's auc: 0.832806
[776]	Train's auc: 0.857854	Evaluation's auc: 0.832833
[777]	Train's auc: 0.857879	Evaluation's auc: 0.83285
[778]	Train's auc: 0.857901	Evaluation's auc: 0.832858
[779]	Train's auc: 0.857917	Evaluation's auc: 0.832849
[780]	Train's auc: 0.857938	Evaluation's auc: 0.832866
[781]	Train's auc: 0.85796	Evaluation's auc: 0.83288
[782]	Train's auc: 0.857983	Evaluation's auc: 0.832881
[783]	Train's auc: 0.858004	Evaluation's auc: 0.832883
[784]	Train's auc: 0.858021	Evaluation's auc: 0.832882
[785]	Train's auc: 0.858047	Evaluation's auc: 0.8329
[786]	Train's auc: 0.858073	Evaluation's auc: 0.832897
[787]	Train's auc: 0.858091	Evaluation's auc: 0.832913
[788]	Train's auc: 0.858107	Evaluation's auc: 0.83291
[789]	Train's auc: 0.858132	Evaluation's auc: 0.832931
[790]	Train's auc: 0.858165	Evaluation's auc: 0.832955
[791]	Train's auc: 0.858187	Evaluation's auc: 0.832956
[792]	Train's auc: 0.858205	Evaluation's auc: 0.832973
[793]	Train's auc: 0.858225	Evaluation's auc: 0.832986
[794]	Train's auc: 0.858245	Evaluation's auc: 0.832981
[795]	Train's auc: 0.858274	Evaluation's auc: 0.833
[796]	Train's auc: 0.8583	Evaluation's auc: 0.833013
[797]	Train's auc: 0.858322	Evaluation's auc: 0.83304
[798]	Train's auc: 0.858346	Evaluation's auc: 0.833045
[799]	Train's auc: 0.858369	Evaluation's auc: 0.833065
[800]	Train's auc: 0.858389	Evaluation's auc: 0.833066
[801]	Train's auc: 0.858408	Evaluation's auc: 0.833068
[802]	Train's auc: 0.858433	Evaluation's auc: 0.833105
[803]	Train's auc: 0.858459	Evaluation's auc: 0.833114
[804]	Train's auc: 0.858489	Evaluation's auc: 0.83311
[805]	Train's auc: 0.858523	Evaluation's auc: 0.833134
[806]	Train's auc: 0.858553	Evaluation's auc: 0.833161
[807]	Train's auc: 0.858574	Evaluation's auc: 0.833154
[808]	Train's auc: 0.858607	Evaluation's auc: 0.833164
[809]	Train's auc: 0.858632	Evaluation's auc: 0.833158
[810]	Train's auc: 0.858651	Evaluation's auc: 0.833169
[811]	Train's auc: 0.85867	Evaluation's auc: 0.833166
[812]	Train's auc: 0.858694	Evaluation's auc: 0.833182
[813]	Train's auc: 0.858718	Evaluation's auc: 0.833192
[814]	Train's auc: 0.858741	Evaluation's auc: 0.833207
[815]	Train's auc: 0.858763	Evaluation's auc: 0.833203
[816]	Train's auc: 0.858791	Evaluation's auc: 0.833221
[817]	Train's auc: 0.858805	Evaluation's auc: 0.833224
[818]	Train's auc: 0.858826	Evaluation's auc: 0.83323
[819]	Train's auc: 0.858844	Evaluation's auc: 0.833238
[820]	Train's auc: 0.858863	Evaluation's auc: 0.83323
[821]	Train's auc: 0.858891	Evaluation's auc: 0.833256
[822]	Train's auc: 0.858905	Evaluation's auc: 0.83327
[823]	Train's auc: 0.858924	Evaluation's auc: 0.83327
...
[1472]	Train's auc: 0.870333	Evaluation's auc: 0.836705

(Per-iteration boosting log truncated: over iterations 823–1472 the training AUC climbed from 0.8589 to 0.8703 while the evaluation AUC climbed from 0.8333 to 0.8367, with the evaluation AUC still improving slowly, i.e. no clear overfitting yet at iteration 1472.)
[1473]	Train's auc: 0.87034	Evaluation's auc: 0.836692
[1474]	Train's auc: 0.870355	Evaluation's auc: 0.8367
[1475]	Train's auc: 0.870371	Evaluation's auc: 0.836714
[1476]	Train's auc: 0.870387	Evaluation's auc: 0.836734
[1477]	Train's auc: 0.870406	Evaluation's auc: 0.836745
[1478]	Train's auc: 0.870417	Evaluation's auc: 0.836753
[1479]	Train's auc: 0.87043	Evaluation's auc: 0.836757
[1480]	Train's auc: 0.870442	Evaluation's auc: 0.83678
[1481]	Train's auc: 0.870454	Evaluation's auc: 0.836785
[1482]	Train's auc: 0.870472	Evaluation's auc: 0.83678
[1483]	Train's auc: 0.870482	Evaluation's auc: 0.836788
[1484]	Train's auc: 0.870495	Evaluation's auc: 0.836786
[1485]	Train's auc: 0.870509	Evaluation's auc: 0.836791
[1486]	Train's auc: 0.870527	Evaluation's auc: 0.836817
[1487]	Train's auc: 0.870533	Evaluation's auc: 0.836825
[1488]	Train's auc: 0.87055	Evaluation's auc: 0.83685
[1489]	Train's auc: 0.870566	Evaluation's auc: 0.836849
[1490]	Train's auc: 0.870574	Evaluation's auc: 0.836858
[1491]	Train's auc: 0.87059	Evaluation's auc: 0.836862
[1492]	Train's auc: 0.870602	Evaluation's auc: 0.836877
[1493]	Train's auc: 0.87062	Evaluation's auc: 0.836882
[1494]	Train's auc: 0.870633	Evaluation's auc: 0.836888
[1495]	Train's auc: 0.870644	Evaluation's auc: 0.836896
[1496]	Train's auc: 0.870654	Evaluation's auc: 0.836903
[1497]	Train's auc: 0.870668	Evaluation's auc: 0.836912
[1498]	Train's auc: 0.870684	Evaluation's auc: 0.836916
[1499]	Train's auc: 0.870699	Evaluation's auc: 0.836939
[1500]	Train's auc: 0.87071	Evaluation's auc: 0.83695
[1501]	Train's auc: 0.870727	Evaluation's auc: 0.836952
[1502]	Train's auc: 0.870735	Evaluation's auc: 0.836955
[1503]	Train's auc: 0.870752	Evaluation's auc: 0.836962
[1504]	Train's auc: 0.870771	Evaluation's auc: 0.836973
[1505]	Train's auc: 0.870779	Evaluation's auc: 0.83697
[1506]	Train's auc: 0.870789	Evaluation's auc: 0.83697
[1507]	Train's auc: 0.870807	Evaluation's auc: 0.836956
[1508]	Train's auc: 0.870821	Evaluation's auc: 0.836957
[1509]	Train's auc: 0.870831	Evaluation's auc: 0.836943
[1510]	Train's auc: 0.870847	Evaluation's auc: 0.836949
[1511]	Train's auc: 0.870862	Evaluation's auc: 0.836955
[1512]	Train's auc: 0.870876	Evaluation's auc: 0.83696
[1513]	Train's auc: 0.870891	Evaluation's auc: 0.836963
[1514]	Train's auc: 0.870909	Evaluation's auc: 0.836957
[1515]	Train's auc: 0.870923	Evaluation's auc: 0.836963
[1516]	Train's auc: 0.870933	Evaluation's auc: 0.836966
[1517]	Train's auc: 0.870952	Evaluation's auc: 0.836969
[1518]	Train's auc: 0.870966	Evaluation's auc: 0.836979
[1519]	Train's auc: 0.870974	Evaluation's auc: 0.836982
[1520]	Train's auc: 0.87099	Evaluation's auc: 0.836979
[1521]	Train's auc: 0.871003	Evaluation's auc: 0.836975
[1522]	Train's auc: 0.871014	Evaluation's auc: 0.836974
[1523]	Train's auc: 0.87103	Evaluation's auc: 0.83697
[1524]	Train's auc: 0.871042	Evaluation's auc: 0.83697
[1525]	Train's auc: 0.871053	Evaluation's auc: 0.836977
[1526]	Train's auc: 0.871065	Evaluation's auc: 0.836979
[1527]	Train's auc: 0.871072	Evaluation's auc: 0.836987
[1528]	Train's auc: 0.87108	Evaluation's auc: 0.836999
[1529]	Train's auc: 0.871089	Evaluation's auc: 0.837001
[1530]	Train's auc: 0.871102	Evaluation's auc: 0.836989
[1531]	Train's auc: 0.871118	Evaluation's auc: 0.836996
[1532]	Train's auc: 0.87113	Evaluation's auc: 0.836996
[1533]	Train's auc: 0.871143	Evaluation's auc: 0.837009
[1534]	Train's auc: 0.871154	Evaluation's auc: 0.837009
[1535]	Train's auc: 0.87117	Evaluation's auc: 0.837012
[1536]	Train's auc: 0.871178	Evaluation's auc: 0.837003
[1537]	Train's auc: 0.871193	Evaluation's auc: 0.837016
[1538]	Train's auc: 0.871208	Evaluation's auc: 0.837022
[1539]	Train's auc: 0.871227	Evaluation's auc: 0.837012
[1540]	Train's auc: 0.871246	Evaluation's auc: 0.837021
[1541]	Train's auc: 0.871258	Evaluation's auc: 0.837023
[1542]	Train's auc: 0.871278	Evaluation's auc: 0.837016
[1543]	Train's auc: 0.871287	Evaluation's auc: 0.837016
[1544]	Train's auc: 0.871296	Evaluation's auc: 0.837017
[1545]	Train's auc: 0.871312	Evaluation's auc: 0.837021
[1546]	Train's auc: 0.871327	Evaluation's auc: 0.837029
[1547]	Train's auc: 0.871345	Evaluation's auc: 0.837037
[1548]	Train's auc: 0.87136	Evaluation's auc: 0.837038
[1549]	Train's auc: 0.871371	Evaluation's auc: 0.837047
[1550]	Train's auc: 0.871384	Evaluation's auc: 0.837051
[1551]	Train's auc: 0.871398	Evaluation's auc: 0.837061
[1552]	Train's auc: 0.871411	Evaluation's auc: 0.837051
[1553]	Train's auc: 0.871422	Evaluation's auc: 0.837068
[1554]	Train's auc: 0.871434	Evaluation's auc: 0.837062
[1555]	Train's auc: 0.871444	Evaluation's auc: 0.837067
[1556]	Train's auc: 0.871457	Evaluation's auc: 0.837065
[1557]	Train's auc: 0.871474	Evaluation's auc: 0.83707
[1558]	Train's auc: 0.871485	Evaluation's auc: 0.837066
[1559]	Train's auc: 0.871504	Evaluation's auc: 0.837056
[1560]	Train's auc: 0.871518	Evaluation's auc: 0.837054
[1561]	Train's auc: 0.871531	Evaluation's auc: 0.837056
[1562]	Train's auc: 0.871551	Evaluation's auc: 0.837059
[1563]	Train's auc: 0.871559	Evaluation's auc: 0.837051
[1564]	Train's auc: 0.871577	Evaluation's auc: 0.837043
[1565]	Train's auc: 0.871589	Evaluation's auc: 0.837039
[1566]	Train's auc: 0.871601	Evaluation's auc: 0.837039
[1567]	Train's auc: 0.87161	Evaluation's auc: 0.837037
[1568]	Train's auc: 0.871625	Evaluation's auc: 0.837041
[1569]	Train's auc: 0.871631	Evaluation's auc: 0.837037
[1570]	Train's auc: 0.871646	Evaluation's auc: 0.837031
[1571]	Train's auc: 0.871659	Evaluation's auc: 0.837038
[1572]	Train's auc: 0.871677	Evaluation's auc: 0.837034
[1573]	Train's auc: 0.871691	Evaluation's auc: 0.837038
[1574]	Train's auc: 0.8717	Evaluation's auc: 0.83704
[1575]	Train's auc: 0.871713	Evaluation's auc: 0.837047
[1576]	Train's auc: 0.871732	Evaluation's auc: 0.837037
[1577]	Train's auc: 0.871744	Evaluation's auc: 0.837032
Early stopping, best iteration is:
[1557]	Train's auc: 0.871474	Evaluation's auc: 0.83707
In [73]:
y_probs_train = gbm_rh.predict(x_train)
y_probs_val = gbm_rh.predict(x_val)
In [74]:
# We can visualize these ROC curves with matplotlib
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, roc_auc_score

# Compute each curve once instead of calling roc_curve twice per plot
fpr_train, tpr_train, _ = roc_curve(y_train, y_probs_train)
plt.plot(fpr_train, tpr_train, color='blue',
         label='Train ROC Curve (area = %0.2f)' % roc_auc_score(y_train, y_probs_train))

fpr_val, tpr_val, _ = roc_curve(y_val, y_probs_val)
plt.plot(fpr_val, tpr_val, color='red',
         label='Validation ROC Curve (area = %0.2f)' % roc_auc_score(y_val, y_probs_val))

plt.plot([0, 1], [0, 1], color='black', linestyle='--')
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.title('AUC-ROC Curve')
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.legend()
plt.show()
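Beyond the ROC curve, the probability scores can be turned into hard predictions at a chosen threshold and summarized in a confusion matrix. A minimal sketch with stand-in arrays (`y_true` and `y_prob` are hypothetical placeholders for `y_val` and `y_probs_val` above):

```python
import numpy as np

# Stand-in labels/probabilities (hypothetical; replace with y_val / y_probs_val)
y_true = np.array([0, 0, 1, 1, 0, 1])
y_prob = np.array([0.2, 0.6, 0.8, 0.4, 0.1, 0.9])

# Threshold at 0.5 to turn predicted probabilities into hard class labels
y_pred = (y_prob >= 0.5).astype(int)

# 2x2 confusion matrix: rows = actual class, columns = predicted class
cm = np.zeros((2, 2), dtype=int)
for t, p in zip(y_true, y_pred):
    cm[t, p] += 1
print(cm)  # [[TN FP], [FN TP]]
```

The same matrix is available directly from `sklearn.metrics.confusion_matrix(y_true, y_pred)`; the manual loop just makes the row/column convention explicit.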

Hyperband on LightGBM

In [75]:
import hyperband
from hyperband import HyperbandSearchCV
In [76]:
# Set parameters for the model
band_params = {'boosting_type': 'gbdt',
 'class_weight': None,
 'colsample_bytree': 0.9,
 'importance_type': 'split',
 'learning_rate': 0.01,
 'max_depth': 20,
 'min_child_samples': 25,
 'min_split_gain': 0,
 'n_estimators': 4000,
 'n_jobs': -1,
 'num_leaves': 200,
 'objective': 'binary',
 'random_state': None,
 'reg_alpha': 0,
 'reg_lambda': 0,
 'silent': True,
 #'subsample': 0.8,
 'subsample_for_bin': 200000,
 'subsample_freq': 1,
 'metric': 'auc',
 'max_bin': 100,
 'verbose': -1,
 'scale_pos_weight': 1}
In [77]:
param_dict = {
    'learning_rate': [0.001, 0.01, 0.1],
    'max_depth': [5, 15, 20, 30, 50],
    'num_leaves': [50, 150, 200, 250],
    'min_child_samples': [35, 40, 60, 80],
    'subsample': [0.7, 0.8, 0.9],
    'min_child_weight': [0, 0.5, 2, 3, 5]
}
In [78]:
search = HyperbandSearchCV(lgb.LGBMClassifier(**band_params), param_dict, cv=3,
                           resource_param='n_estimators', verbose=100,
                           max_iter=200, min_iter=50,
                           scoring='roc_auc')
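Hyperband works by successive halving: each bracket starts many cheap configurations at a small `n_estimators` budget and promotes only the best to larger budgets. A rough sketch of that schedule, assuming a halving factor `eta=3` (an assumption here; the exact `HyperbandSearchCV` internals may differ):

```python
import math

# Assumed settings mirroring the search above; eta=3 is an assumption
max_iter, min_iter, eta = 200, 50, 3

# Number of successive-halving rungs beyond the full-budget one
s_max = int(math.log(max_iter / min_iter) / math.log(eta))

schedule = []
for s in range(s_max, -1, -1):
    n = eta ** s                     # configurations at the rung's start
    r = int(max_iter * eta ** (-s))  # n_estimators budget per configuration
    schedule.append((n, r))
    print(f"rung s={s}: {n} configs at n_estimators~{r}")
```

With these settings the sketch yields 3 configurations at roughly 66 estimators, then 1 configuration at 200, which matches the bracket sizes printed in the fit log below.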
In [79]:
search.fit(x_train,y_train)
Starting bracket 1 (out of 2) of hyperband
Starting successive halving iteration 1 out of 2. Fitting 3 configurations, with resource_param n_estimators set to 66, and keeping the best 1 configurations.
Fitting 3 folds for each of 3 candidates, totalling 9 fits
[Parallel(n_jobs=1)]: Using backend SequentialBackend with 1 concurrent workers.
[CV] subsample=0.7, num_leaves=50, min_child_weight=0.5, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=66 
[CV]  subsample=0.7, num_leaves=50, min_child_weight=0.5, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=66, score=0.792, total=   1.3s
[Parallel(n_jobs=1)]: Done   1 out of   1 | elapsed:    1.3s remaining:    0.0s
[CV] subsample=0.7, num_leaves=50, min_child_weight=0.5, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=66 
[CV]  subsample=0.7, num_leaves=50, min_child_weight=0.5, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=66, score=0.797, total=   1.3s
[Parallel(n_jobs=1)]: Done   2 out of   2 | elapsed:    2.6s remaining:    0.0s
[CV] subsample=0.7, num_leaves=50, min_child_weight=0.5, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=66 
[CV]  subsample=0.7, num_leaves=50, min_child_weight=0.5, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=66, score=0.796, total=   1.3s
[Parallel(n_jobs=1)]: Done   3 out of   3 | elapsed:    4.0s remaining:    0.0s
[CV] subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=35, max_depth=5, learning_rate=0.01, n_estimators=66 
[CV]  subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=35, max_depth=5, learning_rate=0.01, n_estimators=66, score=0.800, total=   1.4s
[Parallel(n_jobs=1)]: Done   4 out of   4 | elapsed:    5.3s remaining:    0.0s
[CV] subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=35, max_depth=5, learning_rate=0.01, n_estimators=66 
[CV]  subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=35, max_depth=5, learning_rate=0.01, n_estimators=66, score=0.806, total=   1.3s
[Parallel(n_jobs=1)]: Done   5 out of   5 | elapsed:    6.7s remaining:    0.0s
[CV] subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=35, max_depth=5, learning_rate=0.01, n_estimators=66 
[CV]  subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=35, max_depth=5, learning_rate=0.01, n_estimators=66, score=0.804, total=   1.5s
[Parallel(n_jobs=1)]: Done   6 out of   6 | elapsed:    8.2s remaining:    0.0s
[CV] subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=66 
[CV]  subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=66, score=0.832, total=   2.1s
[Parallel(n_jobs=1)]: Done   7 out of   7 | elapsed:   10.3s remaining:    0.0s
[CV] subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=66 
[CV]  subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=66, score=0.839, total=   2.0s
[Parallel(n_jobs=1)]: Done   8 out of   8 | elapsed:   12.3s remaining:    0.0s
[CV] subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=66 
[CV]  subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=66, score=0.833, total=   2.0s
[Parallel(n_jobs=1)]: Done   9 out of   9 | elapsed:   14.3s remaining:    0.0s
[Parallel(n_jobs=1)]: Done   9 out of   9 | elapsed:   14.3s finished
Starting successive halving iteration 2 out of 2. Fitting 1 configurations, with resource_param n_estimators set to 200
Fitting 3 folds for each of 1 candidates, totalling 3 fits
[Parallel(n_jobs=1)]: Using backend SequentialBackend with 1 concurrent workers.
[CV] subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=200 
/Users/piumallick/anaconda3/lib/python3.7/site-packages/sklearn/model_selection/_search.py:823: FutureWarning: The parameter 'iid' is deprecated in 0.22 and will be removed in 0.24.
  "removed in 0.24.", FutureWarning
[CV]  subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=200, score=0.827, total=   4.7s
[Parallel(n_jobs=1)]: Done   1 out of   1 | elapsed:    4.7s remaining:    0.0s
[CV] subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=200 
[CV]  subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=200, score=0.836, total=   4.5s
[Parallel(n_jobs=1)]: Done   2 out of   2 | elapsed:    9.2s remaining:    0.0s
[CV] subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=200 
[CV]  subsample=0.7, num_leaves=150, min_child_weight=0.5, min_child_samples=60, max_depth=15, learning_rate=0.1, n_estimators=200, score=0.829, total=   4.5s
[Parallel(n_jobs=1)]: Done   3 out of   3 | elapsed:   13.7s remaining:    0.0s
[Parallel(n_jobs=1)]: Done   3 out of   3 | elapsed:   13.7s finished
Starting bracket 2 (out of 2) of hyperband
Starting successive halving iteration 1 out of 1. Fitting 2 configurations, with resource_param n_estimators set to 200
Fitting 3 folds for each of 2 candidates, totalling 6 fits
[Parallel(n_jobs=1)]: Using backend SequentialBackend with 1 concurrent workers.
[CV] subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=200 
[CV]  subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=200, score=0.795, total=   2.7s
[Parallel(n_jobs=1)]: Done   1 out of   1 | elapsed:    2.7s remaining:    0.0s
[CV] subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=200 
[CV]  subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=200, score=0.799, total=   2.7s
[Parallel(n_jobs=1)]: Done   2 out of   2 | elapsed:    5.4s remaining:    0.0s
[CV] subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=200 
[CV]  subsample=0.7, num_leaves=200, min_child_weight=0, min_child_samples=80, max_depth=5, learning_rate=0.001, n_estimators=200, score=0.798, total=   2.7s
[Parallel(n_jobs=1)]: Done   3 out of   3 | elapsed:    8.0s remaining:    0.0s
[CV] subsample=0.8, num_leaves=50, min_child_weight=0.5, min_child_samples=35, max_depth=5, learning_rate=0.001, n_estimators=200 
[CV]  subsample=0.8, num_leaves=50, min_child_weight=0.5, min_child_samples=35, max_depth=5, learning_rate=0.001, n_estimators=200, score=0.794, total=   3.0s
[Parallel(n_jobs=1)]: Done   4 out of   4 | elapsed:   11.1s remaining:    0.0s
[CV] subsample=0.8, num_leaves=50, min_child_weight=0.5, min_child_samples=35, max_depth=5, learning_rate=0.001, n_estimators=200 
[CV]  subsample=0.8, num_leaves=50, min_child_weight=0.5, min_child_samples=35, max_depth=5, learning_rate=0.001, n_estimators=200, score=0.798, total=   2.9s
[Parallel(n_jobs=1)]: Done   5 out of   5 | elapsed:   14.0s remaining:    0.0s
[CV] subsample=0.8, num_leaves=50, min_child_weight=0.5, min_child_samples=35, max_depth=5, learning_rate=0.001, n_estimators=200 
[CV]  subsample=0.8, num_leaves=50, min_child_weight=0.5, min_child_samples=35, max_depth=5, learning_rate=0.001, n_estimators=200, score=0.798, total=   3.1s
[Parallel(n_jobs=1)]: Done   6 out of   6 | elapsed:   17.1s remaining:    0.0s
[Parallel(n_jobs=1)]: Done   6 out of   6 | elapsed:   17.1s finished
Out[79]:
HyperbandSearchCV(cv=3, error_score='raise',
                  estimator=LGBMClassifier(boosting_type='gbdt',
                                           class_weight=None,
                                           colsample_bytree=0.9,
                                           importance_type='split',
                                           learning_rate=0.01, max_bin=100,
                                           max_depth=20, metric='auc',
                                           min_child_samples=25,
                                           min_child_weight=0.001,
                                           min_split_gain=0, n_estimators=4000,
                                           n_jobs=-1, num_leaves=200,
                                           objective='binary',
                                           random_state=None,...
                  param_distributions={'learning_rate': [0.001, 0.01, 0.1],
                                       'max_depth': [5, 15, 20, 30, 50],
                                       'min_child_samples': [35, 40, 60, 80],
                                       'min_child_weight': [0, 0.5, 2, 3, 5],
                                       'num_leaves': [50, 150, 200, 250],
                                       'subsample': [0.7, 0.8, 0.9]},
                  pre_dispatch='2*n_jobs', random_state=None, refit=True,
                  resource_param='n_estimators', return_train_score=False,
                  scoring='roc_auc', skip_last=0, verbose=100)
In [80]:
search.best_params_
Out[80]:
{'subsample': 0.7,
 'num_leaves': 150,
 'min_child_weight': 0.5,
 'min_child_samples': 60,
 'max_depth': 15,
 'learning_rate': 0.1,
 'n_estimators': 66}
In [81]:
band_params = {'subsample': 0.9,
 'num_leaves': 200,
 'min_child_weight': 0,
 'min_child_samples': 80,
 'max_depth': 30,
 'learning_rate': 0.1,
 'n_estimators': 66}
In [82]:
search.cv_results_
Out[82]:
{'mean_fit_time': array([1.1687669 , 1.25619014, 1.8475283 , 4.286666  , 2.48585145,
        2.83708604]),
 'std_fit_time': array([0.02089219, 0.06786707, 0.04141315, 0.08947038, 0.00455839,
        0.09006264]),
 'mean_score_time': array([0.13999486, 0.1445628 , 0.17552074, 0.274158  , 0.18325957,
        0.17965094]),
 'std_score_time': array([0.00130434, 0.00329493, 0.00481657, 0.00373901, 0.00464876,
        0.00055194]),
 'param_subsample': masked_array(data=[0.7, 0.7, 0.7, 0.7, 0.7, 0.8],
              mask=[False, False, False, False, False, False],
        fill_value='?',
             dtype=object),
 'param_num_leaves': masked_array(data=[50, 200, 150, 150, 200, 50],
              mask=[False, False, False, False, False, False],
        fill_value='?',
             dtype=object),
 'param_min_child_weight': masked_array(data=[0.5, 0, 0.5, 0.5, 0, 0.5],
              mask=[False, False, False, False, False, False],
        fill_value='?',
             dtype=object),
 'param_min_child_samples': masked_array(data=[80, 35, 60, 60, 80, 35],
              mask=[False, False, False, False, False, False],
        fill_value='?',
             dtype=object),
 'param_max_depth': masked_array(data=[5, 5, 15, 15, 5, 5],
              mask=[False, False, False, False, False, False],
        fill_value='?',
             dtype=object),
 'param_learning_rate': masked_array(data=[0.001, 0.01, 0.1, 0.1, 0.001, 0.001],
              mask=[False, False, False, False, False, False],
        fill_value='?',
             dtype=object),
 'param_n_estimators': masked_array(data=[66, 66, 66, 200, 200, 200],
              mask=[False, False, False, False, False, False],
        fill_value='?',
             dtype=object),
 'params': [{'subsample': 0.7,
   'num_leaves': 50,
   'min_child_weight': 0.5,
   'min_child_samples': 80,
   'max_depth': 5,
   'learning_rate': 0.001,
   'n_estimators': 66},
  {'subsample': 0.7,
   'num_leaves': 200,
   'min_child_weight': 0,
   'min_child_samples': 35,
   'max_depth': 5,
   'learning_rate': 0.01,
   'n_estimators': 66},
  {'subsample': 0.7,
   'num_leaves': 150,
   'min_child_weight': 0.5,
   'min_child_samples': 60,
   'max_depth': 15,
   'learning_rate': 0.1,
   'n_estimators': 66},
  {'subsample': 0.7,
   'num_leaves': 150,
   'min_child_weight': 0.5,
   'min_child_samples': 60,
   'max_depth': 15,
   'learning_rate': 0.1,
   'n_estimators': 200},
  {'subsample': 0.7,
   'num_leaves': 200,
   'min_child_weight': 0,
   'min_child_samples': 80,
   'max_depth': 5,
   'learning_rate': 0.001,
   'n_estimators': 200},
  {'subsample': 0.8,
   'num_leaves': 50,
   'min_child_weight': 0.5,
   'min_child_samples': 35,
   'max_depth': 5,
   'learning_rate': 0.001,
   'n_estimators': 200}],
 'split0_test_score': array([0.79226019, 0.79973459, 0.83238271, 0.82717367, 0.79451763,
        0.79391491]),
 'split1_test_score': array([0.79718138, 0.80639349, 0.83885198, 0.83623924, 0.79928345,
        0.79823213]),
 'split2_test_score': array([0.79629067, 0.80356313, 0.83292964, 0.82879948, 0.79829625,
        0.79782653]),
 'mean_test_score': array([0.79524406, 0.8032304 , 0.83472148, 0.8307375 , 0.79736576,
        0.79665783]),
 'std_test_score': array([0.00214105, 0.00272867, 0.00292927, 0.00394658, 0.00205389,
        0.00194662]),
 'rank_test_score': array([6, 3, 1, 2, 4, 5], dtype=int32),
 'hyperband_bracket': array([1., 1., 1., 1., 2., 2.])}
In [83]:
# Scoring the tuned model (ROC AUC) on the training set
search.score(x_train, y_train)
Out[83]:
0.9477880595150036
In [84]:
search.score(x_test, y_test)
Out[84]:
0.8375628226678985

Using Hyperband:

Post-evaluation Train AUC: 0.9477880595150036

Post-evaluation Test AUC: 0.8375628226678985

The gap between train and test AUC shows severe overfitting, so we will stick with the manually tuned LightGBM model. We will still explore the remaining hyperparameter tuning approaches for comparison.
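The overfit is easy to quantify as the gap between the two scores (values copied from the outputs above):

```python
# AUC values reported by search.score above
train_auc = 0.9477880595150036
test_auc = 0.8375628226678985

# A train-test gap above ~0.1 AUC signals heavy overfitting
gap = train_auc - test_auc
print(f"Train-test AUC gap: {gap:.4f}")
```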

Hyperopt on LightGBM

In [85]:
# Set the search space and the prior distributions over it
lgbm_space = {
# hp.choice selects one value from the given list (other boosting options: 'dart', 'goss', 'rf')
    'boosting_type': hp.hp.choice('boosting_type',  ['gbdt']),
    'objective': hp.hp.choice('objective',  ['binary']),
    'num_leaves':hp.hp.choice('num_leaves', np.arange(10, 100,1, dtype=int)),
    'subsample':hp.hp.quniform('subsample',0.5,1.0,0.05),
    'colsample_bytree':hp.hp.quniform('colsample_bytree',0.5,1.0,0.05),
    'min_child_weight':hp.hp.quniform('min_child_weight', 100, 1000,100),
    'lambda_l1': hp.hp.uniform('lambda_l1', 0.0, 1000.0),
    'lambda_l2': hp.hp.uniform('lambda_l2', 0.0, 1000.0),
    'learning_rate': hp.hp.loguniform('learning_rate', -4, 0),
    'feature_fraction': hp.hp.loguniform('feature_fraction', -4, 0),
    'bagging_fraction': hp.hp.loguniform('bagging_fraction', -4, 0),
    'bagging_frequency':hp.hp.choice('bagging_frequency', np.arange(5, 100,1, dtype=int)),
    'drop_rate': hp.hp.loguniform('drop_rate', -4, 0),
    'scale_pos_weight': hp.hp.uniform('scale_pos_weight', 6.0, 10.0),
    'metric' : hp.hp.choice('metric', ['auc']),
    'max_depth': hp.hp.choice('max_depth', np.arange(1, 100,1, dtype=int))
    }
In [86]:
# Here we define the objective (loss) function that Hyperopt will minimize
def objective_m(params, n_folds=5):
    # Cross-validated LightGBM training with early stopping
    model = lgb.cv(params=params,
                   train_set=lgb_proc_train_rh,
                   num_boost_round=10000,
                   early_stopping_rounds=10,
                   nfold=n_folds)

    # Return 1 - best mean validation AUC, so minimizing the loss maximizes AUC
    loss = 1 - max(model['auc-mean'])
    return loss


bayes_trials = Trials()
MAX_EVALS = 100  # this controls the runtime

lgbm_best_m = fmin(fn=objective_m, space=lgbm_space, algo=hp.tpe.suggest,
                   max_evals=MAX_EVALS, trials=bayes_trials)
100%|██████████| 100/100 [09:11<00:00,  5.51s/trial, best loss: 0.1574320283879065]
In [87]:
lgbm_best_m
Out[87]:
{'bagging_fraction': 0.04690582606565,
 'bagging_frequency': 27,
 'boosting_type': 0,
 'colsample_bytree': 0.6000000000000001,
 'drop_rate': 0.05448125188564363,
 'feature_fraction': 0.42721382915764394,
 'lambda_l1': 2.409364745534752,
 'lambda_l2': 210.22831642617797,
 'learning_rate': 0.31014578499166295,
 'max_depth': 74,
 'metric': 0,
 'min_child_weight': 100.0,
 'num_leaves': 77,
 'objective': 0,
 'scale_pos_weight': 8.258766413767718,
 'subsample': 0.9}
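Note that `fmin` reports `hp.choice` parameters as positional indices (e.g. `'boosting_type': 0` means the 0th option, `'gbdt'`), so the indices must be mapped back to values before building a LightGBM parameter dict; `hyperopt.space_eval(lgbm_space, lgbm_best_m)` does this automatically. A minimal sketch of that decoding with hypothetical values:

```python
# Options defined with hp.choice in the search space
choice_options = {
    'boosting_type': ['gbdt'],
    'objective': ['binary'],
    'metric': ['auc'],
}

# Hypothetical fmin result: choice params come back as indices
best = {'boosting_type': 0, 'objective': 0, 'metric': 0, 'learning_rate': 0.31}

# Map each index back to its value; pass non-choice params through unchanged
decoded = {k: (choice_options[k][v] if k in choice_options else v)
           for k, v in best.items()}
print(decoded)
```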
In [88]:
lgb_params2 = {'bagging_fraction': 0.4584000884522912,
 'bagging_frequency': 75,
 'boosting_type': 'gbdt',
 'metric': 'auc',
 'colsample_bytree': 0.8,
 'drop_rate': 0.28657769218383206,
 'feature_fraction': 0.06676229703625178,
 'lambda_l1': 3.7177635273122966,
 'lambda_l2': 694.9058177271635,
 'learning_rate': 0.1504837141507504,
 'max_depth': 40,
 'min_child_weight': 100.0,
 'num_leaves': 51,
 'scale_pos_weight': 7.4503069212775985,
 'subsample': 0.9
 }
In [89]:
gbm_rh2 = lgb.train(params = lgb_params2, train_set = lgb_proc_train_rh,
                num_boost_round = 5000, valid_sets = [lgb_proc_val_rh, lgb_proc_train_rh],
               valid_names = ['Evaluation', 'Train'])
[1]	Train's auc: 0.636951	Evaluation's auc: 0.629649
[2]	Train's auc: 0.703514	Evaluation's auc: 0.689602
[3]	Train's auc: 0.724583	Evaluation's auc: 0.712357
[4]	Train's auc: 0.7584	Evaluation's auc: 0.743413
[5]	Train's auc: 0.758036	Evaluation's auc: 0.742912
[6]	Train's auc: 0.76326	Evaluation's auc: 0.749456
[7]	Train's auc: 0.794212	Evaluation's auc: 0.782233
[8]	Train's auc: 0.799726	Evaluation's auc: 0.787347
[9]	Train's auc: 0.800631	Evaluation's auc: 0.787874
[10]	Train's auc: 0.802235	Evaluation's auc: 0.790893
... (iterations 11-119 omitted; Train AUC climbs from 0.8015 to 0.8711 and Evaluation AUC from 0.7896 to 0.8358) ...
[120]	Train's auc: 0.871748	Evaluation's auc: 0.83661
[121]	Train's auc: 0.872016	Evaluation's auc: 0.836827
[122]	Train's auc: 0.872308	Evaluation's auc: 0.836917
[123]	Train's auc: 0.872522	Evaluation's auc: 0.836912
[124]	Train's auc: 0.872751	Evaluation's auc: 0.836892
[125]	Train's auc: 0.872873	Evaluation's auc: 0.836884
[126]	Train's auc: 0.873154	Evaluation's auc: 0.836829
[127]	Train's auc: 0.873423	Evaluation's auc: 0.836976
[128]	Train's auc: 0.87365	Evaluation's auc: 0.836953
[129]	Train's auc: 0.873904	Evaluation's auc: 0.836978
[130]	Train's auc: 0.874123	Evaluation's auc: 0.836944
[131]	Train's auc: 0.874371	Evaluation's auc: 0.836947
[132]	Train's auc: 0.874507	Evaluation's auc: 0.836977
[133]	Train's auc: 0.874742	Evaluation's auc: 0.836881
[134]	Train's auc: 0.875234	Evaluation's auc: 0.837219
[135]	Train's auc: 0.875444	Evaluation's auc: 0.837189
[136]	Train's auc: 0.875593	Evaluation's auc: 0.837238
[137]	Train's auc: 0.875775	Evaluation's auc: 0.837209
[138]	Train's auc: 0.876096	Evaluation's auc: 0.837288
[139]	Train's auc: 0.876178	Evaluation's auc: 0.837303
[140]	Train's auc: 0.876604	Evaluation's auc: 0.837896
[141]	Train's auc: 0.876758	Evaluation's auc: 0.837875
[142]	Train's auc: 0.877036	Evaluation's auc: 0.838003
[143]	Train's auc: 0.877282	Evaluation's auc: 0.838102
[144]	Train's auc: 0.877625	Evaluation's auc: 0.838103
[145]	Train's auc: 0.877855	Evaluation's auc: 0.838112
[146]	Train's auc: 0.878007	Evaluation's auc: 0.838193
[147]	Train's auc: 0.878119	Evaluation's auc: 0.83813
[148]	Train's auc: 0.878375	Evaluation's auc: 0.838147
[149]	Train's auc: 0.878579	Evaluation's auc: 0.838132
[150]	Train's auc: 0.87883	Evaluation's auc: 0.838295
[151]	Train's auc: 0.87906	Evaluation's auc: 0.838285
[152]	Train's auc: 0.879215	Evaluation's auc: 0.838376
[153]	Train's auc: 0.87948	Evaluation's auc: 0.838378
[154]	Train's auc: 0.879725	Evaluation's auc: 0.838416
[155]	Train's auc: 0.879927	Evaluation's auc: 0.838273
[156]	Train's auc: 0.880103	Evaluation's auc: 0.838302
[157]	Train's auc: 0.880249	Evaluation's auc: 0.838351
[158]	Train's auc: 0.88043	Evaluation's auc: 0.838298
[159]	Train's auc: 0.880706	Evaluation's auc: 0.838288
[160]	Train's auc: 0.880897	Evaluation's auc: 0.838317
[161]	Train's auc: 0.881186	Evaluation's auc: 0.838281
[162]	Train's auc: 0.881506	Evaluation's auc: 0.838369
[163]	Train's auc: 0.881681	Evaluation's auc: 0.838444
[164]	Train's auc: 0.88186	Evaluation's auc: 0.838517
[165]	Train's auc: 0.882101	Evaluation's auc: 0.838548
[166]	Train's auc: 0.88234	Evaluation's auc: 0.838594
[167]	Train's auc: 0.88254	Evaluation's auc: 0.83857
[168]	Train's auc: 0.882802	Evaluation's auc: 0.838644
[169]	Train's auc: 0.883033	Evaluation's auc: 0.838657
[170]	Train's auc: 0.883146	Evaluation's auc: 0.838657
[171]	Train's auc: 0.883287	Evaluation's auc: 0.838674
[172]	Train's auc: 0.883522	Evaluation's auc: 0.838746
[173]	Train's auc: 0.883675	Evaluation's auc: 0.838735
[174]	Train's auc: 0.883879	Evaluation's auc: 0.838839
[175]	Train's auc: 0.884049	Evaluation's auc: 0.8389
[176]	Train's auc: 0.88451	Evaluation's auc: 0.83949
[177]	Train's auc: 0.884732	Evaluation's auc: 0.839537
[178]	Train's auc: 0.884922	Evaluation's auc: 0.839479
[179]	Train's auc: 0.885109	Evaluation's auc: 0.839367
[180]	Train's auc: 0.885347	Evaluation's auc: 0.839444
[181]	Train's auc: 0.885536	Evaluation's auc: 0.839404
[182]	Train's auc: 0.885636	Evaluation's auc: 0.839386
[183]	Train's auc: 0.885773	Evaluation's auc: 0.839372
[184]	Train's auc: 0.885881	Evaluation's auc: 0.839397
[185]	Train's auc: 0.88604	Evaluation's auc: 0.839334
[186]	Train's auc: 0.886232	Evaluation's auc: 0.839422
[187]	Train's auc: 0.886408	Evaluation's auc: 0.839428
[188]	Train's auc: 0.886577	Evaluation's auc: 0.839427
[189]	Train's auc: 0.886855	Evaluation's auc: 0.839519
[190]	Train's auc: 0.887018	Evaluation's auc: 0.839715
[191]	Train's auc: 0.887134	Evaluation's auc: 0.839644
[192]	Train's auc: 0.887321	Evaluation's auc: 0.839619
[193]	Train's auc: 0.887536	Evaluation's auc: 0.839654
[194]	Train's auc: 0.88774	Evaluation's auc: 0.839611
[195]	Train's auc: 0.887924	Evaluation's auc: 0.839643
[196]	Train's auc: 0.888149	Evaluation's auc: 0.8397
[197]	Train's auc: 0.888331	Evaluation's auc: 0.839678
[198]	Train's auc: 0.888566	Evaluation's auc: 0.839794
[199]	Train's auc: 0.888749	Evaluation's auc: 0.839736
[200]	Train's auc: 0.888933	Evaluation's auc: 0.839771
[201]	Train's auc: 0.889043	Evaluation's auc: 0.839781
[202]	Train's auc: 0.889268	Evaluation's auc: 0.839732
[203]	Train's auc: 0.889446	Evaluation's auc: 0.839687
[204]	Train's auc: 0.889691	Evaluation's auc: 0.839596
[205]	Train's auc: 0.889839	Evaluation's auc: 0.839608
[206]	Train's auc: 0.890025	Evaluation's auc: 0.839633
[207]	Train's auc: 0.890113	Evaluation's auc: 0.839664
[208]	Train's auc: 0.890204	Evaluation's auc: 0.839711
[209]	Train's auc: 0.8904	Evaluation's auc: 0.839799
[210]	Train's auc: 0.89049	Evaluation's auc: 0.839717
[211]	Train's auc: 0.890677	Evaluation's auc: 0.839702
[212]	Train's auc: 0.891038	Evaluation's auc: 0.840132
[213]	Train's auc: 0.891241	Evaluation's auc: 0.840166
[214]	Train's auc: 0.891397	Evaluation's auc: 0.840176
[215]	Train's auc: 0.891651	Evaluation's auc: 0.84053
[216]	Train's auc: 0.891874	Evaluation's auc: 0.840458
[217]	Train's auc: 0.891904	Evaluation's auc: 0.840473
[218]	Train's auc: 0.892043	Evaluation's auc: 0.840457
[219]	Train's auc: 0.892149	Evaluation's auc: 0.840449
[220]	Train's auc: 0.892309	Evaluation's auc: 0.840387
[221]	Train's auc: 0.892526	Evaluation's auc: 0.840431
[222]	Train's auc: 0.892719	Evaluation's auc: 0.840327
[223]	Train's auc: 0.892919	Evaluation's auc: 0.840302
[224]	Train's auc: 0.892983	Evaluation's auc: 0.840304
[225]	Train's auc: 0.89311	Evaluation's auc: 0.840311
[226]	Train's auc: 0.893213	Evaluation's auc: 0.840284
[227]	Train's auc: 0.893356	Evaluation's auc: 0.840317
[228]	Train's auc: 0.893552	Evaluation's auc: 0.840225
[229]	Train's auc: 0.893729	Evaluation's auc: 0.840236
[230]	Train's auc: 0.893896	Evaluation's auc: 0.840286
[231]	Train's auc: 0.894057	Evaluation's auc: 0.840304
[232]	Train's auc: 0.894168	Evaluation's auc: 0.840354
[233]	Train's auc: 0.89442	Evaluation's auc: 0.84038
[234]	Train's auc: 0.894569	Evaluation's auc: 0.840366
[235]	Train's auc: 0.894741	Evaluation's auc: 0.840306
[236]	Train's auc: 0.894947	Evaluation's auc: 0.840285
[237]	Train's auc: 0.895058	Evaluation's auc: 0.840346
[238]	Train's auc: 0.895221	Evaluation's auc: 0.840339
[239]	Train's auc: 0.895279	Evaluation's auc: 0.84038
[240]	Train's auc: 0.895442	Evaluation's auc: 0.840372
[241]	Train's auc: 0.895648	Evaluation's auc: 0.840752
[242]	Train's auc: 0.895783	Evaluation's auc: 0.840758
[243]	Train's auc: 0.895883	Evaluation's auc: 0.840681
[244]	Train's auc: 0.896102	Evaluation's auc: 0.840646
[245]	Train's auc: 0.89629	Evaluation's auc: 0.840668
[246]	Train's auc: 0.896433	Evaluation's auc: 0.840703
[247]	Train's auc: 0.89664	Evaluation's auc: 0.840675
[248]	Train's auc: 0.896802	Evaluation's auc: 0.840696
[249]	Train's auc: 0.896972	Evaluation's auc: 0.840693
[250]	Train's auc: 0.897126	Evaluation's auc: 0.840691
[251]	Train's auc: 0.897265	Evaluation's auc: 0.840739
[252]	Train's auc: 0.897427	Evaluation's auc: 0.84086
[253]	Train's auc: 0.89763	Evaluation's auc: 0.840867
[254]	Train's auc: 0.897707	Evaluation's auc: 0.840897
[255]	Train's auc: 0.897856	Evaluation's auc: 0.840849
[256]	Train's auc: 0.898116	Evaluation's auc: 0.840982
[257]	Train's auc: 0.898278	Evaluation's auc: 0.841054
[258]	Train's auc: 0.898398	Evaluation's auc: 0.841048
[259]	Train's auc: 0.898547	Evaluation's auc: 0.841056
[260]	Train's auc: 0.898657	Evaluation's auc: 0.841169
[261]	Train's auc: 0.898877	Evaluation's auc: 0.841178
[262]	Train's auc: 0.899072	Evaluation's auc: 0.841145
[263]	Train's auc: 0.899245	Evaluation's auc: 0.841102
[264]	Train's auc: 0.899495	Evaluation's auc: 0.841267
[265]	Train's auc: 0.899697	Evaluation's auc: 0.841249
[266]	Train's auc: 0.899941	Evaluation's auc: 0.841145
[267]	Train's auc: 0.900042	Evaluation's auc: 0.84115
[268]	Train's auc: 0.900158	Evaluation's auc: 0.841102
[269]	Train's auc: 0.90028	Evaluation's auc: 0.841074
[270]	Train's auc: 0.900379	Evaluation's auc: 0.841102
[271]	Train's auc: 0.90047	Evaluation's auc: 0.841109
[272]	Train's auc: 0.900631	Evaluation's auc: 0.841018
[273]	Train's auc: 0.900791	Evaluation's auc: 0.841066
[274]	Train's auc: 0.900869	Evaluation's auc: 0.841047
[275]	Train's auc: 0.901015	Evaluation's auc: 0.841045
[276]	Train's auc: 0.901186	Evaluation's auc: 0.841157
[277]	Train's auc: 0.901307	Evaluation's auc: 0.841179
[278]	Train's auc: 0.901435	Evaluation's auc: 0.84119
[279]	Train's auc: 0.901557	Evaluation's auc: 0.841191
[280]	Train's auc: 0.901706	Evaluation's auc: 0.841212
[281]	Train's auc: 0.901875	Evaluation's auc: 0.841207
[282]	Train's auc: 0.902027	Evaluation's auc: 0.84144
[283]	Train's auc: 0.902162	Evaluation's auc: 0.84146
[284]	Train's auc: 0.902263	Evaluation's auc: 0.841412
[285]	Train's auc: 0.90243	Evaluation's auc: 0.841447
[286]	Train's auc: 0.90255	Evaluation's auc: 0.841431
[287]	Train's auc: 0.902743	Evaluation's auc: 0.841412
[288]	Train's auc: 0.9029	Evaluation's auc: 0.841394
[289]	Train's auc: 0.903076	Evaluation's auc: 0.841308
[290]	Train's auc: 0.903236	Evaluation's auc: 0.841377
[291]	Train's auc: 0.903483	Evaluation's auc: 0.841618
[292]	Train's auc: 0.903564	Evaluation's auc: 0.841603
[293]	Train's auc: 0.903679	Evaluation's auc: 0.841594
[294]	Train's auc: 0.903706	Evaluation's auc: 0.841598
[295]	Train's auc: 0.903784	Evaluation's auc: 0.841609
[296]	Train's auc: 0.903902	Evaluation's auc: 0.841774
[297]	Train's auc: 0.903978	Evaluation's auc: 0.841786
[298]	Train's auc: 0.904183	Evaluation's auc: 0.841751
[299]	Train's auc: 0.904326	Evaluation's auc: 0.841791
[300]	Train's auc: 0.904427	Evaluation's auc: 0.841874
[301]	Train's auc: 0.904578	Evaluation's auc: 0.841847
[302]	Train's auc: 0.904738	Evaluation's auc: 0.841829
[303]	Train's auc: 0.904904	Evaluation's auc: 0.841779
[304]	Train's auc: 0.904981	Evaluation's auc: 0.84181
[305]	Train's auc: 0.905111	Evaluation's auc: 0.841761
[306]	Train's auc: 0.905208	Evaluation's auc: 0.841776
[307]	Train's auc: 0.905357	Evaluation's auc: 0.841795
[308]	Train's auc: 0.905499	Evaluation's auc: 0.841955
[309]	Train's auc: 0.905569	Evaluation's auc: 0.841933
[310]	Train's auc: 0.905696	Evaluation's auc: 0.841949
[311]	Train's auc: 0.905842	Evaluation's auc: 0.841947
[312]	Train's auc: 0.906017	Evaluation's auc: 0.842011
[313]	Train's auc: 0.906127	Evaluation's auc: 0.841987
[314]	Train's auc: 0.906343	Evaluation's auc: 0.84205
[315]	Train's auc: 0.906416	Evaluation's auc: 0.842085
[316]	Train's auc: 0.90655	Evaluation's auc: 0.842088
[317]	Train's auc: 0.906695	Evaluation's auc: 0.842078
[318]	Train's auc: 0.906817	Evaluation's auc: 0.842006
[319]	Train's auc: 0.906987	Evaluation's auc: 0.841898
[320]	Train's auc: 0.907013	Evaluation's auc: 0.841916
[321]	Train's auc: 0.907147	Evaluation's auc: 0.841904
[322]	Train's auc: 0.907301	Evaluation's auc: 0.841894
[323]	Train's auc: 0.907413	Evaluation's auc: 0.841914
[324]	Train's auc: 0.907544	Evaluation's auc: 0.841957
[325]	Train's auc: 0.907597	Evaluation's auc: 0.841956
[326]	Train's auc: 0.907761	Evaluation's auc: 0.841878
[327]	Train's auc: 0.907922	Evaluation's auc: 0.841898
[328]	Train's auc: 0.908092	Evaluation's auc: 0.841883
[329]	Train's auc: 0.908239	Evaluation's auc: 0.841873
[330]	Train's auc: 0.908363	Evaluation's auc: 0.84183
[331]	Train's auc: 0.908514	Evaluation's auc: 0.84187
[332]	Train's auc: 0.908692	Evaluation's auc: 0.84192
[333]	Train's auc: 0.90876	Evaluation's auc: 0.841878
[334]	Train's auc: 0.908866	Evaluation's auc: 0.841825
[335]	Train's auc: 0.90904	Evaluation's auc: 0.841847
[336]	Train's auc: 0.90913	Evaluation's auc: 0.841801
[337]	Train's auc: 0.909268	Evaluation's auc: 0.841798
[338]	Train's auc: 0.909393	Evaluation's auc: 0.841909
[339]	Train's auc: 0.909635	Evaluation's auc: 0.842012
[340]	Train's auc: 0.909767	Evaluation's auc: 0.841998
[341]	Train's auc: 0.909863	Evaluation's auc: 0.842039
[342]	Train's auc: 0.909975	Evaluation's auc: 0.842005
[343]	Train's auc: 0.910125	Evaluation's auc: 0.842155
[344]	Train's auc: 0.910268	Evaluation's auc: 0.842191
[345]	Train's auc: 0.910314	Evaluation's auc: 0.842163
[346]	Train's auc: 0.910443	Evaluation's auc: 0.842135
[347]	Train's auc: 0.910561	Evaluation's auc: 0.842151
[348]	Train's auc: 0.910778	Evaluation's auc: 0.842483
[349]	Train's auc: 0.910872	Evaluation's auc: 0.842458
[350]	Train's auc: 0.910962	Evaluation's auc: 0.842453
[351]	Train's auc: 0.91106	Evaluation's auc: 0.842454
[352]	Train's auc: 0.911173	Evaluation's auc: 0.842427
[353]	Train's auc: 0.91138	Evaluation's auc: 0.842554
[354]	Train's auc: 0.911532	Evaluation's auc: 0.84258
[355]	Train's auc: 0.911581	Evaluation's auc: 0.842573
[356]	Train's auc: 0.911711	Evaluation's auc: 0.842626
[357]	Train's auc: 0.911854	Evaluation's auc: 0.842637
[358]	Train's auc: 0.911974	Evaluation's auc: 0.842674
[359]	Train's auc: 0.912082	Evaluation's auc: 0.842572
[360]	Train's auc: 0.91222	Evaluation's auc: 0.842552
[361]	Train's auc: 0.912354	Evaluation's auc: 0.842519
[362]	Train's auc: 0.912498	Evaluation's auc: 0.842513
[363]	Train's auc: 0.912645	Evaluation's auc: 0.842491
[364]	Train's auc: 0.912829	Evaluation's auc: 0.84251
[365]	Train's auc: 0.913	Evaluation's auc: 0.842551
[366]	Train's auc: 0.913117	Evaluation's auc: 0.842512
[367]	Train's auc: 0.913235	Evaluation's auc: 0.842532
[368]	Train's auc: 0.913362	Evaluation's auc: 0.842515
[369]	Train's auc: 0.913479	Evaluation's auc: 0.842512
[370]	Train's auc: 0.913595	Evaluation's auc: 0.84243
[371]	Train's auc: 0.913703	Evaluation's auc: 0.842464
[372]	Train's auc: 0.913828	Evaluation's auc: 0.842463
[373]	Train's auc: 0.91395	Evaluation's auc: 0.842434
[374]	Train's auc: 0.914082	Evaluation's auc: 0.84241
[375]	Train's auc: 0.914174	Evaluation's auc: 0.842398
[376]	Train's auc: 0.914259	Evaluation's auc: 0.842345
[377]	Train's auc: 0.914385	Evaluation's auc: 0.842333
[378]	Train's auc: 0.914508	Evaluation's auc: 0.8423
[379]	Train's auc: 0.914624	Evaluation's auc: 0.842258
[380]	Train's auc: 0.914744	Evaluation's auc: 0.842187
[381]	Train's auc: 0.914823	Evaluation's auc: 0.842184
[382]	Train's auc: 0.914931	Evaluation's auc: 0.842253
[383]	Train's auc: 0.914989	Evaluation's auc: 0.842227
[384]	Train's auc: 0.915116	Evaluation's auc: 0.842195
[385]	Train's auc: 0.915225	Evaluation's auc: 0.842137
[386]	Train's auc: 0.915369	Evaluation's auc: 0.84214
[387]	Train's auc: 0.915422	Evaluation's auc: 0.84213
[388]	Train's auc: 0.915559	Evaluation's auc: 0.842114
[389]	Train's auc: 0.915704	Evaluation's auc: 0.842124
[390]	Train's auc: 0.915868	Evaluation's auc: 0.842144
[391]	Train's auc: 0.915913	Evaluation's auc: 0.842144
[392]	Train's auc: 0.915988	Evaluation's auc: 0.84214
[393]	Train's auc: 0.916045	Evaluation's auc: 0.842177
[394]	Train's auc: 0.916124	Evaluation's auc: 0.842167
[395]	Train's auc: 0.916304	Evaluation's auc: 0.842213
[396]	Train's auc: 0.91637	Evaluation's auc: 0.842249
[397]	Train's auc: 0.916486	Evaluation's auc: 0.842215
[398]	Train's auc: 0.916538	Evaluation's auc: 0.842231
[399]	Train's auc: 0.916619	Evaluation's auc: 0.842255
[400]	Train's auc: 0.916661	Evaluation's auc: 0.842255
[401]	Train's auc: 0.916738	Evaluation's auc: 0.84225
[402]	Train's auc: 0.916861	Evaluation's auc: 0.842185
[403]	Train's auc: 0.916973	Evaluation's auc: 0.842146
[404]	Train's auc: 0.917067	Evaluation's auc: 0.842125
[405]	Train's auc: 0.917122	Evaluation's auc: 0.842106
[406]	Train's auc: 0.917244	Evaluation's auc: 0.842093
[407]	Train's auc: 0.917382	Evaluation's auc: 0.842056
[408]	Train's auc: 0.917447	Evaluation's auc: 0.84203
[409]	Train's auc: 0.917532	Evaluation's auc: 0.84198
[410]	Train's auc: 0.917664	Evaluation's auc: 0.84199
[411]	Train's auc: 0.917875	Evaluation's auc: 0.841874
[412]	Train's auc: 0.917967	Evaluation's auc: 0.841902
[413]	Train's auc: 0.918101	Evaluation's auc: 0.841877
[414]	Train's auc: 0.918172	Evaluation's auc: 0.841886
[415]	Train's auc: 0.918284	Evaluation's auc: 0.841861
[416]	Train's auc: 0.918397	Evaluation's auc: 0.841829
[417]	Train's auc: 0.918534	Evaluation's auc: 0.841831
[418]	Train's auc: 0.918634	Evaluation's auc: 0.841853
[419]	Train's auc: 0.918795	Evaluation's auc: 0.841811
[420]	Train's auc: 0.918879	Evaluation's auc: 0.841773
[421]	Train's auc: 0.918982	Evaluation's auc: 0.841729
[422]	Train's auc: 0.919135	Evaluation's auc: 0.841724
[423]	Train's auc: 0.919219	Evaluation's auc: 0.841733
[424]	Train's auc: 0.919345	Evaluation's auc: 0.841704
[425]	Train's auc: 0.919455	Evaluation's auc: 0.841704
[426]	Train's auc: 0.919602	Evaluation's auc: 0.841627
[427]	Train's auc: 0.919775	Evaluation's auc: 0.84162
[428]	Train's auc: 0.919846	Evaluation's auc: 0.841611
[429]	Train's auc: 0.919953	Evaluation's auc: 0.841559
[430]	Train's auc: 0.92006	Evaluation's auc: 0.841562
[431]	Train's auc: 0.920171	Evaluation's auc: 0.841593
[432]	Train's auc: 0.920239	Evaluation's auc: 0.841598
[433]	Train's auc: 0.920339	Evaluation's auc: 0.841598
[434]	Train's auc: 0.920422	Evaluation's auc: 0.841619
[435]	Train's auc: 0.920519	Evaluation's auc: 0.841629
[436]	Train's auc: 0.920614	Evaluation's auc: 0.841614
[437]	Train's auc: 0.920734	Evaluation's auc: 0.841582
[438]	Train's auc: 0.920877	Evaluation's auc: 0.841575
[439]	Train's auc: 0.920944	Evaluation's auc: 0.841552
[440]	Train's auc: 0.921047	Evaluation's auc: 0.841478
[441]	Train's auc: 0.92111	Evaluation's auc: 0.841504
[442]	Train's auc: 0.921211	Evaluation's auc: 0.841521
[443]	Train's auc: 0.921367	Evaluation's auc: 0.841477
[444]	Train's auc: 0.92148	Evaluation's auc: 0.841488
[445]	Train's auc: 0.92155	Evaluation's auc: 0.841482
[446]	Train's auc: 0.921611	Evaluation's auc: 0.8415
[447]	Train's auc: 0.921688	Evaluation's auc: 0.841515
[448]	Train's auc: 0.921775	Evaluation's auc: 0.841533
[449]	Train's auc: 0.921933	Evaluation's auc: 0.84148
[450]	Train's auc: 0.92205	Evaluation's auc: 0.841497
[451]	Train's auc: 0.922188	Evaluation's auc: 0.841465
[452]	Train's auc: 0.922261	Evaluation's auc: 0.841467
[453]	Train's auc: 0.922352	Evaluation's auc: 0.841457
[454]	Train's auc: 0.922403	Evaluation's auc: 0.841457
[455]	Train's auc: 0.922501	Evaluation's auc: 0.841399
[456]	Train's auc: 0.922606	Evaluation's auc: 0.841412
[457]	Train's auc: 0.922678	Evaluation's auc: 0.841403
[458]	Train's auc: 0.922805	Evaluation's auc: 0.841373
[459]	Train's auc: 0.922891	Evaluation's auc: 0.841355
[460]	Train's auc: 0.923007	Evaluation's auc: 0.84127
[461]	Train's auc: 0.92304	Evaluation's auc: 0.841273
[462]	Train's auc: 0.923126	Evaluation's auc: 0.841471
[463]	Train's auc: 0.92315	Evaluation's auc: 0.84149
[464]	Train's auc: 0.923257	Evaluation's auc: 0.841456
[465]	Train's auc: 0.923337	Evaluation's auc: 0.841439
[466]	Train's auc: 0.923429	Evaluation's auc: 0.841389
[467]	Train's auc: 0.92352	Evaluation's auc: 0.841453
[468]	Train's auc: 0.923617	Evaluation's auc: 0.841457
[469]	Train's auc: 0.923709	Evaluation's auc: 0.841459
[470]	Train's auc: 0.92378	Evaluation's auc: 0.841411
[471]	Train's auc: 0.92384	Evaluation's auc: 0.841415
[472]	Train's auc: 0.923889	Evaluation's auc: 0.841402
[473]	Train's auc: 0.924001	Evaluation's auc: 0.841417
[474]	Train's auc: 0.924141	Evaluation's auc: 0.841384
[475]	Train's auc: 0.924205	Evaluation's auc: 0.841392
[476]	Train's auc: 0.924295	Evaluation's auc: 0.841454
[477]	Train's auc: 0.924381	Evaluation's auc: 0.841409
[478]	Train's auc: 0.924548	Evaluation's auc: 0.841447
[479]	Train's auc: 0.924644	Evaluation's auc: 0.841414
[480]	Train's auc: 0.924764	Evaluation's auc: 0.841452
[481]	Train's auc: 0.924855	Evaluation's auc: 0.841468
[482]	Train's auc: 0.924887	Evaluation's auc: 0.841453
[483]	Train's auc: 0.924996	Evaluation's auc: 0.841471
[484]	Train's auc: 0.925092	Evaluation's auc: 0.841484
[485]	Train's auc: 0.92519	Evaluation's auc: 0.841479
[486]	Train's auc: 0.925245	Evaluation's auc: 0.84146
[487]	Train's auc: 0.925354	Evaluation's auc: 0.841476
[488]	Train's auc: 0.925433	Evaluation's auc: 0.841491
[489]	Train's auc: 0.92555	Evaluation's auc: 0.841581
[490]	Train's auc: 0.925616	Evaluation's auc: 0.841592
[491]	Train's auc: 0.925699	Evaluation's auc: 0.841561
[492]	Train's auc: 0.925824	Evaluation's auc: 0.84156
[493]	Train's auc: 0.92595	Evaluation's auc: 0.841482
[494]	Train's auc: 0.926045	Evaluation's auc: 0.841477
[495]	Train's auc: 0.92612	Evaluation's auc: 0.841462
[496]	Train's auc: 0.926219	Evaluation's auc: 0.841478
[497]	Train's auc: 0.926265	Evaluation's auc: 0.841466
[498]	Train's auc: 0.926389	Evaluation's auc: 0.841497
[499]	Train's auc: 0.926466	Evaluation's auc: 0.841515
[500]	Train's auc: 0.926561	Evaluation's auc: 0.841514
[501]	Train's auc: 0.926649	Evaluation's auc: 0.841518
[502]	Train's auc: 0.926768	Evaluation's auc: 0.841507
[503]	Train's auc: 0.926837	Evaluation's auc: 0.841545
[504]	Train's auc: 0.926924	Evaluation's auc: 0.841616
[505]	Train's auc: 0.927079	Evaluation's auc: 0.841758
[506]	Train's auc: 0.927225	Evaluation's auc: 0.841727
[507]	Train's auc: 0.927364	Evaluation's auc: 0.84163
[508]	Train's auc: 0.927474	Evaluation's auc: 0.841619
[509]	Train's auc: 0.927584	Evaluation's auc: 0.841552
[510]	Train's auc: 0.927651	Evaluation's auc: 0.841576
[511]	Train's auc: 0.927696	Evaluation's auc: 0.841534
[512]	Train's auc: 0.927797	Evaluation's auc: 0.841556
[513]	Train's auc: 0.927898	Evaluation's auc: 0.84158
[514]	Train's auc: 0.927954	Evaluation's auc: 0.841697
[515]	Train's auc: 0.928074	Evaluation's auc: 0.841683
[516]	Train's auc: 0.928107	Evaluation's auc: 0.841681
[517]	Train's auc: 0.928253	Evaluation's auc: 0.841672
[518]	Train's auc: 0.928268	Evaluation's auc: 0.841645
[519]	Train's auc: 0.928391	Evaluation's auc: 0.841625
[520]	Train's auc: 0.928493	Evaluation's auc: 0.841615
[521]	Train's auc: 0.928594	Evaluation's auc: 0.841595
[522]	Train's auc: 0.928686	Evaluation's auc: 0.84158
[523]	Train's auc: 0.928786	Evaluation's auc: 0.84163
[524]	Train's auc: 0.928845	Evaluation's auc: 0.841621
[525]	Train's auc: 0.928924	Evaluation's auc: 0.841577
[526]	Train's auc: 0.92904	Evaluation's auc: 0.841534
[527]	Train's auc: 0.929126	Evaluation's auc: 0.841697
[528]	Train's auc: 0.929223	Evaluation's auc: 0.84165
[529]	Train's auc: 0.929345	Evaluation's auc: 0.8416
[530]	Train's auc: 0.929448	Evaluation's auc: 0.841575
[531]	Train's auc: 0.929556	Evaluation's auc: 0.841568
[532]	Train's auc: 0.929597	Evaluation's auc: 0.841559
[533]	Train's auc: 0.929634	Evaluation's auc: 0.841562
[534]	Train's auc: 0.929794	Evaluation's auc: 0.841514
[535]	Train's auc: 0.929969	Evaluation's auc: 0.841411
[536]	Train's auc: 0.930006	Evaluation's auc: 0.841411
[537]	Train's auc: 0.930073	Evaluation's auc: 0.841421
[538]	Train's auc: 0.930161	Evaluation's auc: 0.841349
[539]	Train's auc: 0.930263	Evaluation's auc: 0.841339
[540]	Train's auc: 0.930351	Evaluation's auc: 0.841301
[541]	Train's auc: 0.930477	Evaluation's auc: 0.841226
[542]	Train's auc: 0.930564	Evaluation's auc: 0.841213
[543]	Train's auc: 0.930664	Evaluation's auc: 0.841134
[544]	Train's auc: 0.930808	Evaluation's auc: 0.841104
[545]	Train's auc: 0.930947	Evaluation's auc: 0.841099
[546]	Train's auc: 0.931013	Evaluation's auc: 0.841085
[547]	Train's auc: 0.931075	Evaluation's auc: 0.841099
[548]	Train's auc: 0.931198	Evaluation's auc: 0.841174
[549]	Train's auc: 0.931296	Evaluation's auc: 0.841203
[550]	Train's auc: 0.931361	Evaluation's auc: 0.84124
[551]	Train's auc: 0.931444	Evaluation's auc: 0.841211
[552]	Train's auc: 0.93153	Evaluation's auc: 0.841187
[553]	Train's auc: 0.931553	Evaluation's auc: 0.84116
[554]	Train's auc: 0.931643	Evaluation's auc: 0.841177
[555]	Train's auc: 0.931725	Evaluation's auc: 0.841139
[556]	Train's auc: 0.931814	Evaluation's auc: 0.841149
[557]	Train's auc: 0.931877	Evaluation's auc: 0.841135
[558]	Train's auc: 0.931974	Evaluation's auc: 0.841088
[559]	Train's auc: 0.932065	Evaluation's auc: 0.841078
[560]	Train's auc: 0.932168	Evaluation's auc: 0.841073
[561]	Train's auc: 0.932293	Evaluation's auc: 0.841065
[562]	Train's auc: 0.932371	Evaluation's auc: 0.841063
[563]	Train's auc: 0.932458	Evaluation's auc: 0.841016
[564]	Train's auc: 0.932525	Evaluation's auc: 0.841224
[565]	Train's auc: 0.932591	Evaluation's auc: 0.841247
[566]	Train's auc: 0.932645	Evaluation's auc: 0.841281
[567]	Train's auc: 0.932684	Evaluation's auc: 0.841273
[568]	Train's auc: 0.932773	Evaluation's auc: 0.841279
[569]	Train's auc: 0.932826	Evaluation's auc: 0.841236
[570]	Train's auc: 0.932928	Evaluation's auc: 0.84124
[571]	Train's auc: 0.93301	Evaluation's auc: 0.841243
[572]	Train's auc: 0.933078	Evaluation's auc: 0.841327
[573]	Train's auc: 0.933102	Evaluation's auc: 0.841325
[574]	Train's auc: 0.933274	Evaluation's auc: 0.841428
[575]	Train's auc: 0.933326	Evaluation's auc: 0.841372
[576]	Train's auc: 0.933421	Evaluation's auc: 0.841371
[577]	Train's auc: 0.933469	Evaluation's auc: 0.841346
[578]	Train's auc: 0.933564	Evaluation's auc: 0.841416
[579]	Train's auc: 0.933744	Evaluation's auc: 0.841419
[580]	Train's auc: 0.933796	Evaluation's auc: 0.841426
[581]	Train's auc: 0.933894	Evaluation's auc: 0.841451
[582]	Train's auc: 0.933977	Evaluation's auc: 0.841407
[583]	Train's auc: 0.934038	Evaluation's auc: 0.841401
[584]	Train's auc: 0.934129	Evaluation's auc: 0.841464
[585]	Train's auc: 0.934195	Evaluation's auc: 0.841461
[586]	Train's auc: 0.934321	Evaluation's auc: 0.841491
[587]	Train's auc: 0.934365	Evaluation's auc: 0.841473
[588]	Train's auc: 0.934428	Evaluation's auc: 0.841454
[589]	Train's auc: 0.93452	Evaluation's auc: 0.841433
[590]	Train's auc: 0.934601	Evaluation's auc: 0.84142
[591]	Train's auc: 0.934688	Evaluation's auc: 0.84139
[592]	Train's auc: 0.93473	Evaluation's auc: 0.841374
[593]	Train's auc: 0.93482	Evaluation's auc: 0.841366
[594]	Train's auc: 0.9349	Evaluation's auc: 0.841343
[595]	Train's auc: 0.934989	Evaluation's auc: 0.841317
[596]	Train's auc: 0.93509	Evaluation's auc: 0.841308
[597]	Train's auc: 0.935154	Evaluation's auc: 0.841339
[598]	Train's auc: 0.935211	Evaluation's auc: 0.841312
[599]	Train's auc: 0.935287	Evaluation's auc: 0.841304
[600]	Train's auc: 0.935369	Evaluation's auc: 0.841305
[601]	Train's auc: 0.935434	Evaluation's auc: 0.841296
[602]	Train's auc: 0.935501	Evaluation's auc: 0.841328
[603]	Train's auc: 0.935567	Evaluation's auc: 0.841307
[604]	Train's auc: 0.935656	Evaluation's auc: 0.841309
[605]	Train's auc: 0.935765	Evaluation's auc: 0.841315
[606]	Train's auc: 0.935851	Evaluation's auc: 0.841337
[607]	Train's auc: 0.935922	Evaluation's auc: 0.841344
[608]	Train's auc: 0.936018	Evaluation's auc: 0.841317
[609]	Train's auc: 0.936105	Evaluation's auc: 0.841298
[610]	Train's auc: 0.936194	Evaluation's auc: 0.841365
[611]	Train's auc: 0.936306	Evaluation's auc: 0.841339
[612]	Train's auc: 0.936381	Evaluation's auc: 0.841398
[613]	Train's auc: 0.936449	Evaluation's auc: 0.841351
[614]	Train's auc: 0.936532	Evaluation's auc: 0.841335
[615]	Train's auc: 0.936578	Evaluation's auc: 0.841307
[616]	Train's auc: 0.936685	Evaluation's auc: 0.841271
[617]	Train's auc: 0.93677	Evaluation's auc: 0.841268
[618]	Train's auc: 0.936833	Evaluation's auc: 0.841272
[619]	Train's auc: 0.936911	Evaluation's auc: 0.841239
[620]	Train's auc: 0.936989	Evaluation's auc: 0.841181
[621]	Train's auc: 0.93703	Evaluation's auc: 0.84116
[622]	Train's auc: 0.937134	Evaluation's auc: 0.841178
[623]	Train's auc: 0.937184	Evaluation's auc: 0.841152
[624]	Train's auc: 0.93722	Evaluation's auc: 0.841147
[625]	Train's auc: 0.937264	Evaluation's auc: 0.84115
[626]	Train's auc: 0.937317	Evaluation's auc: 0.841103
[627]	Train's auc: 0.93738	Evaluation's auc: 0.841199
[628]	Train's auc: 0.937453	Evaluation's auc: 0.841148
[629]	Train's auc: 0.937506	Evaluation's auc: 0.841116
[630]	Train's auc: 0.937602	Evaluation's auc: 0.841101
[631]	Train's auc: 0.937647	Evaluation's auc: 0.841084
[632]	Train's auc: 0.937751	Evaluation's auc: 0.841104
[633]	Train's auc: 0.937804	Evaluation's auc: 0.841111
[634]	Train's auc: 0.937865	Evaluation's auc: 0.841122
[635]	Train's auc: 0.937956	Evaluation's auc: 0.841131
[636]	Train's auc: 0.938014	Evaluation's auc: 0.841102
[637]	Train's auc: 0.938117	Evaluation's auc: 0.841043
...
[1289]	Train's auc: 0.967677	Evaluation's auc: 0.838493

(Verbose per-iteration log truncated. Over iterations 637–1289, training AUC climbs steadily from ~0.938 to ~0.968 while evaluation AUC drifts down from ~0.8410 to ~0.8385 — the gap widens with every round, i.e. the model is overfitting throughout this stretch, so the useful stopping point lies at or before iteration 637.)
[1290]	Train's auc: 0.967716	Evaluation's auc: 0.838474
[1291]	Train's auc: 0.967752	Evaluation's auc: 0.838469
[1292]	Train's auc: 0.967786	Evaluation's auc: 0.83846
[1293]	Train's auc: 0.967801	Evaluation's auc: 0.838515
[1294]	Train's auc: 0.967826	Evaluation's auc: 0.838543
[1295]	Train's auc: 0.967864	Evaluation's auc: 0.838524
[1296]	Train's auc: 0.967887	Evaluation's auc: 0.838513
[1297]	Train's auc: 0.967911	Evaluation's auc: 0.838554
[1298]	Train's auc: 0.967924	Evaluation's auc: 0.838573
[1299]	Train's auc: 0.967961	Evaluation's auc: 0.838598
[1300]	Train's auc: 0.967984	Evaluation's auc: 0.838561
[1301]	Train's auc: 0.968005	Evaluation's auc: 0.838554
[1302]	Train's auc: 0.968015	Evaluation's auc: 0.838518
[1303]	Train's auc: 0.96804	Evaluation's auc: 0.838485
[1304]	Train's auc: 0.96807	Evaluation's auc: 0.83844
[1305]	Train's auc: 0.968072	Evaluation's auc: 0.838439
[1306]	Train's auc: 0.968094	Evaluation's auc: 0.838375
[1307]	Train's auc: 0.96812	Evaluation's auc: 0.838357
[1308]	Train's auc: 0.968143	Evaluation's auc: 0.838359
[1309]	Train's auc: 0.968196	Evaluation's auc: 0.838322
[1310]	Train's auc: 0.968231	Evaluation's auc: 0.838359
[1311]	Train's auc: 0.968277	Evaluation's auc: 0.838334
[1312]	Train's auc: 0.968321	Evaluation's auc: 0.838275
[1313]	Train's auc: 0.968338	Evaluation's auc: 0.838268
[1314]	Train's auc: 0.968371	Evaluation's auc: 0.838327
[1315]	Train's auc: 0.968406	Evaluation's auc: 0.838352
[1316]	Train's auc: 0.968434	Evaluation's auc: 0.838393
[1317]	Train's auc: 0.968456	Evaluation's auc: 0.838358
[1318]	Train's auc: 0.968488	Evaluation's auc: 0.838326
[1319]	Train's auc: 0.968503	Evaluation's auc: 0.838308
[1320]	Train's auc: 0.96851	Evaluation's auc: 0.838303
[1321]	Train's auc: 0.968535	Evaluation's auc: 0.838333
[1322]	Train's auc: 0.968548	Evaluation's auc: 0.83831
[1323]	Train's auc: 0.968583	Evaluation's auc: 0.838295
[1324]	Train's auc: 0.968603	Evaluation's auc: 0.83837
[1325]	Train's auc: 0.968652	Evaluation's auc: 0.838352
[1326]	Train's auc: 0.968669	Evaluation's auc: 0.838359
[1327]	Train's auc: 0.968689	Evaluation's auc: 0.838352
[1328]	Train's auc: 0.968721	Evaluation's auc: 0.838344
[1329]	Train's auc: 0.968744	Evaluation's auc: 0.838312
[1330]	Train's auc: 0.968764	Evaluation's auc: 0.838346
[1331]	Train's auc: 0.968782	Evaluation's auc: 0.838346
[1332]	Train's auc: 0.968789	Evaluation's auc: 0.838352
[1333]	Train's auc: 0.96882	Evaluation's auc: 0.838347
[1334]	Train's auc: 0.968843	Evaluation's auc: 0.838391
[1335]	Train's auc: 0.968873	Evaluation's auc: 0.838355
[1336]	Train's auc: 0.96892	Evaluation's auc: 0.838334
[1337]	Train's auc: 0.96893	Evaluation's auc: 0.838335
[1338]	Train's auc: 0.968976	Evaluation's auc: 0.838339
[1339]	Train's auc: 0.968995	Evaluation's auc: 0.838355
[1340]	Train's auc: 0.969036	Evaluation's auc: 0.838355
[1341]	Train's auc: 0.969065	Evaluation's auc: 0.838352
[1342]	Train's auc: 0.969107	Evaluation's auc: 0.83836
[1343]	Train's auc: 0.969113	Evaluation's auc: 0.838353
[1344]	Train's auc: 0.969136	Evaluation's auc: 0.838365
[1345]	Train's auc: 0.969178	Evaluation's auc: 0.838398
[1346]	Train's auc: 0.96919	Evaluation's auc: 0.838408
[1347]	Train's auc: 0.969218	Evaluation's auc: 0.838376
[1348]	Train's auc: 0.969244	Evaluation's auc: 0.838368
[1349]	Train's auc: 0.969272	Evaluation's auc: 0.838338
[1350]	Train's auc: 0.969315	Evaluation's auc: 0.838324
[1351]	Train's auc: 0.969327	Evaluation's auc: 0.838312
[1352]	Train's auc: 0.969347	Evaluation's auc: 0.838296
[1353]	Train's auc: 0.969358	Evaluation's auc: 0.838296
[1354]	Train's auc: 0.969368	Evaluation's auc: 0.838306
[1355]	Train's auc: 0.969376	Evaluation's auc: 0.838302
[1356]	Train's auc: 0.969404	Evaluation's auc: 0.838306
[1357]	Train's auc: 0.969433	Evaluation's auc: 0.838314
[1358]	Train's auc: 0.969432	Evaluation's auc: 0.838333
[1359]	Train's auc: 0.969477	Evaluation's auc: 0.838342
[1360]	Train's auc: 0.969511	Evaluation's auc: 0.838317
[1361]	Train's auc: 0.969553	Evaluation's auc: 0.838455
[1362]	Train's auc: 0.969558	Evaluation's auc: 0.838481
[1363]	Train's auc: 0.969582	Evaluation's auc: 0.838478
[1364]	Train's auc: 0.969596	Evaluation's auc: 0.838479
[1365]	Train's auc: 0.969621	Evaluation's auc: 0.838464
[1366]	Train's auc: 0.969645	Evaluation's auc: 0.838424
[1367]	Train's auc: 0.969671	Evaluation's auc: 0.838419
[1368]	Train's auc: 0.969676	Evaluation's auc: 0.838419
[1369]	Train's auc: 0.969712	Evaluation's auc: 0.838424
[1370]	Train's auc: 0.969745	Evaluation's auc: 0.838432
[1371]	Train's auc: 0.969765	Evaluation's auc: 0.838396
[1372]	Train's auc: 0.969798	Evaluation's auc: 0.838353
[1373]	Train's auc: 0.969843	Evaluation's auc: 0.838327
[1374]	Train's auc: 0.969869	Evaluation's auc: 0.838328
[1375]	Train's auc: 0.969874	Evaluation's auc: 0.838318
[1376]	Train's auc: 0.969894	Evaluation's auc: 0.838332
[1377]	Train's auc: 0.969919	Evaluation's auc: 0.83832
[1378]	Train's auc: 0.969948	Evaluation's auc: 0.838287
[1379]	Train's auc: 0.969949	Evaluation's auc: 0.838287
[1380]	Train's auc: 0.969983	Evaluation's auc: 0.838308
[1381]	Train's auc: 0.97001	Evaluation's auc: 0.838321
[1382]	Train's auc: 0.970051	Evaluation's auc: 0.838308
[1383]	Train's auc: 0.970058	Evaluation's auc: 0.838319
[1384]	Train's auc: 0.970061	Evaluation's auc: 0.838324
[1385]	Train's auc: 0.970075	Evaluation's auc: 0.838328
[1386]	Train's auc: 0.970105	Evaluation's auc: 0.838368
[1387]	Train's auc: 0.970121	Evaluation's auc: 0.838364
[1388]	Train's auc: 0.970129	Evaluation's auc: 0.838366
[1389]	Train's auc: 0.970159	Evaluation's auc: 0.838375
[1390]	Train's auc: 0.970175	Evaluation's auc: 0.838384
[1391]	Train's auc: 0.97019	Evaluation's auc: 0.838408
[1392]	Train's auc: 0.970209	Evaluation's auc: 0.838403
[1393]	Train's auc: 0.970243	Evaluation's auc: 0.838416
[1394]	Train's auc: 0.970279	Evaluation's auc: 0.838385
[1395]	Train's auc: 0.970302	Evaluation's auc: 0.838383
[1396]	Train's auc: 0.970316	Evaluation's auc: 0.838459
[1397]	Train's auc: 0.970336	Evaluation's auc: 0.838442
[1398]	Train's auc: 0.970394	Evaluation's auc: 0.838383
[1399]	Train's auc: 0.97043	Evaluation's auc: 0.838369
[1400]	Train's auc: 0.970436	Evaluation's auc: 0.838372
[1401]	Train's auc: 0.970442	Evaluation's auc: 0.838366
[1402]	Train's auc: 0.970455	Evaluation's auc: 0.838375
[1403]	Train's auc: 0.970471	Evaluation's auc: 0.838372
[1404]	Train's auc: 0.970479	Evaluation's auc: 0.838362
[1405]	Train's auc: 0.970513	Evaluation's auc: 0.838356
[1406]	Train's auc: 0.970512	Evaluation's auc: 0.838363
[1407]	Train's auc: 0.97055	Evaluation's auc: 0.838337
[1408]	Train's auc: 0.970587	Evaluation's auc: 0.838338
[1409]	Train's auc: 0.970612	Evaluation's auc: 0.838287
[1410]	Train's auc: 0.970653	Evaluation's auc: 0.838231
[1411]	Train's auc: 0.970675	Evaluation's auc: 0.838223
[1412]	Train's auc: 0.970686	Evaluation's auc: 0.838199
[1413]	Train's auc: 0.970727	Evaluation's auc: 0.838207
[1414]	Train's auc: 0.97075	Evaluation's auc: 0.838269
[1415]	Train's auc: 0.970758	Evaluation's auc: 0.838271
[1416]	Train's auc: 0.970791	Evaluation's auc: 0.838256
[1417]	Train's auc: 0.970792	Evaluation's auc: 0.838247
[1418]	Train's auc: 0.970826	Evaluation's auc: 0.838257
[1419]	Train's auc: 0.970831	Evaluation's auc: 0.838253
[1420]	Train's auc: 0.970849	Evaluation's auc: 0.838212
[1421]	Train's auc: 0.970884	Evaluation's auc: 0.838213
[1422]	Train's auc: 0.970905	Evaluation's auc: 0.838174
[1423]	Train's auc: 0.970922	Evaluation's auc: 0.838182
[1424]	Train's auc: 0.970933	Evaluation's auc: 0.83817
[1425]	Train's auc: 0.97097	Evaluation's auc: 0.838223
[1426]	Train's auc: 0.970976	Evaluation's auc: 0.838233
[1427]	Train's auc: 0.971001	Evaluation's auc: 0.838241
[1428]	Train's auc: 0.971039	Evaluation's auc: 0.838258
[1429]	Train's auc: 0.971041	Evaluation's auc: 0.838264
[1430]	Train's auc: 0.97106	Evaluation's auc: 0.838262
[1431]	Train's auc: 0.971092	Evaluation's auc: 0.838265
[1432]	Train's auc: 0.971114	Evaluation's auc: 0.838228
[1433]	Train's auc: 0.971136	Evaluation's auc: 0.838222
[1434]	Train's auc: 0.971168	Evaluation's auc: 0.838219
[1435]	Train's auc: 0.971188	Evaluation's auc: 0.838182
[1436]	Train's auc: 0.971204	Evaluation's auc: 0.838255
[1437]	Train's auc: 0.971223	Evaluation's auc: 0.838243
[1438]	Train's auc: 0.971226	Evaluation's auc: 0.838264
[1439]	Train's auc: 0.971245	Evaluation's auc: 0.838229
[1440]	Train's auc: 0.971274	Evaluation's auc: 0.8382
[1441]	Train's auc: 0.971292	Evaluation's auc: 0.838227
[1442]	Train's auc: 0.971304	Evaluation's auc: 0.838211
[1443]	Train's auc: 0.971309	Evaluation's auc: 0.838228
[1444]	Train's auc: 0.971323	Evaluation's auc: 0.838229
[1445]	Train's auc: 0.971339	Evaluation's auc: 0.838201
[1446]	Train's auc: 0.97137	Evaluation's auc: 0.838199
[1447]	Train's auc: 0.971389	Evaluation's auc: 0.838207
[1448]	Train's auc: 0.971422	Evaluation's auc: 0.838194
[1449]	Train's auc: 0.971471	Evaluation's auc: 0.838197
[1450]	Train's auc: 0.971502	Evaluation's auc: 0.838186
[1451]	Train's auc: 0.971523	Evaluation's auc: 0.838171
[1452]	Train's auc: 0.971546	Evaluation's auc: 0.8382
[1453]	Train's auc: 0.971551	Evaluation's auc: 0.838186
[1454]	Train's auc: 0.971578	Evaluation's auc: 0.838191
[1455]	Train's auc: 0.971617	Evaluation's auc: 0.83818
[1456]	Train's auc: 0.971634	Evaluation's auc: 0.838147
[1457]	Train's auc: 0.971668	Evaluation's auc: 0.838158
[1458]	Train's auc: 0.971682	Evaluation's auc: 0.838179
[1459]	Train's auc: 0.97172	Evaluation's auc: 0.838178
[1460]	Train's auc: 0.971752	Evaluation's auc: 0.838171
[1461]	Train's auc: 0.971786	Evaluation's auc: 0.838141
[1462]	Train's auc: 0.971789	Evaluation's auc: 0.83813
[1463]	Train's auc: 0.971821	Evaluation's auc: 0.838089
[1464]	Train's auc: 0.97185	Evaluation's auc: 0.83814
[1465]	Train's auc: 0.971845	Evaluation's auc: 0.838135
[1466]	Train's auc: 0.971864	Evaluation's auc: 0.838124
[1467]	Train's auc: 0.971892	Evaluation's auc: 0.838113
[1468]	Train's auc: 0.971932	Evaluation's auc: 0.838119
[1469]	Train's auc: 0.971953	Evaluation's auc: 0.838087
[1470]	Train's auc: 0.971969	Evaluation's auc: 0.83807
[1471]	Train's auc: 0.971983	Evaluation's auc: 0.838108
[1472]	Train's auc: 0.972017	Evaluation's auc: 0.838126
[1473]	Train's auc: 0.97205	Evaluation's auc: 0.838131
[1474]	Train's auc: 0.972052	Evaluation's auc: 0.838132
[1475]	Train's auc: 0.972073	Evaluation's auc: 0.838156
[1476]	Train's auc: 0.972093	Evaluation's auc: 0.838125
[1477]	Train's auc: 0.972099	Evaluation's auc: 0.838112
[1478]	Train's auc: 0.97211	Evaluation's auc: 0.838092
[1479]	Train's auc: 0.972125	Evaluation's auc: 0.838091
[1480]	Train's auc: 0.972145	Evaluation's auc: 0.838042
[1481]	Train's auc: 0.972175	Evaluation's auc: 0.83805
[1482]	Train's auc: 0.972196	Evaluation's auc: 0.838059
[1483]	Train's auc: 0.972199	Evaluation's auc: 0.838054
[1484]	Train's auc: 0.97222	Evaluation's auc: 0.83806
[1485]	Train's auc: 0.972292	Evaluation's auc: 0.838085
[1486]	Train's auc: 0.972299	Evaluation's auc: 0.838046
[1487]	Train's auc: 0.972329	Evaluation's auc: 0.838049
[1488]	Train's auc: 0.97234	Evaluation's auc: 0.838022
[1489]	Train's auc: 0.972367	Evaluation's auc: 0.838005
[1490]	Train's auc: 0.972369	Evaluation's auc: 0.838041
[1491]	Train's auc: 0.972397	Evaluation's auc: 0.838005
[1492]	Train's auc: 0.972424	Evaluation's auc: 0.838014
[1493]	Train's auc: 0.972467	Evaluation's auc: 0.838004
[1494]	Train's auc: 0.972494	Evaluation's auc: 0.838054
[1495]	Train's auc: 0.972508	Evaluation's auc: 0.838066
[1496]	Train's auc: 0.972534	Evaluation's auc: 0.838056
[1497]	Train's auc: 0.972559	Evaluation's auc: 0.83804
[1498]	Train's auc: 0.972601	Evaluation's auc: 0.837977
[1499]	Train's auc: 0.97263	Evaluation's auc: 0.837954
[1500]	Train's auc: 0.972647	Evaluation's auc: 0.837924
[1501]	Train's auc: 0.972659	Evaluation's auc: 0.837969
[1502]	Train's auc: 0.972679	Evaluation's auc: 0.837917
[1503]	Train's auc: 0.972705	Evaluation's auc: 0.837973
[1504]	Train's auc: 0.972739	Evaluation's auc: 0.837959
[1505]	Train's auc: 0.97275	Evaluation's auc: 0.837958
[1506]	Train's auc: 0.97276	Evaluation's auc: 0.837966
[1507]	Train's auc: 0.972787	Evaluation's auc: 0.837958
[1508]	Train's auc: 0.972813	Evaluation's auc: 0.837921
[1509]	Train's auc: 0.972839	Evaluation's auc: 0.837868
[1510]	Train's auc: 0.97285	Evaluation's auc: 0.837883
[1511]	Train's auc: 0.972874	Evaluation's auc: 0.837836
[1512]	Train's auc: 0.9729	Evaluation's auc: 0.837844
[1513]	Train's auc: 0.972921	Evaluation's auc: 0.837801
[1514]	Train's auc: 0.972923	Evaluation's auc: 0.837805
[1515]	Train's auc: 0.972953	Evaluation's auc: 0.837777
[1516]	Train's auc: 0.972961	Evaluation's auc: 0.837771
[1517]	Train's auc: 0.973005	Evaluation's auc: 0.837743
[1518]	Train's auc: 0.973026	Evaluation's auc: 0.837743
[1519]	Train's auc: 0.973027	Evaluation's auc: 0.83777
[1520]	Train's auc: 0.97305	Evaluation's auc: 0.837767
[1521]	Train's auc: 0.973072	Evaluation's auc: 0.837764
[1522]	Train's auc: 0.973087	Evaluation's auc: 0.837734
[1523]	Train's auc: 0.973111	Evaluation's auc: 0.837697
[1524]	Train's auc: 0.973117	Evaluation's auc: 0.837679
[1525]	Train's auc: 0.973151	Evaluation's auc: 0.837674
[1526]	Train's auc: 0.97318	Evaluation's auc: 0.837661
[1527]	Train's auc: 0.973194	Evaluation's auc: 0.837618
[1528]	Train's auc: 0.973205	Evaluation's auc: 0.837607
[1529]	Train's auc: 0.973221	Evaluation's auc: 0.837596
[1530]	Train's auc: 0.973247	Evaluation's auc: 0.837617
[1531]	Train's auc: 0.973256	Evaluation's auc: 0.837645
[1532]	Train's auc: 0.973268	Evaluation's auc: 0.837634
[1533]	Train's auc: 0.973298	Evaluation's auc: 0.837617
[1534]	Train's auc: 0.97332	Evaluation's auc: 0.837613
[1535]	Train's auc: 0.973324	Evaluation's auc: 0.837626
[1536]	Train's auc: 0.973341	Evaluation's auc: 0.837603
[1537]	Train's auc: 0.973388	Evaluation's auc: 0.837692
[1538]	Train's auc: 0.973404	Evaluation's auc: 0.837685
[1539]	Train's auc: 0.973434	Evaluation's auc: 0.837673
[1540]	Train's auc: 0.973446	Evaluation's auc: 0.837674
[1541]	Train's auc: 0.973459	Evaluation's auc: 0.837657
[1542]	Train's auc: 0.973473	Evaluation's auc: 0.837674
[1543]	Train's auc: 0.973494	Evaluation's auc: 0.837663
[1544]	Train's auc: 0.973499	Evaluation's auc: 0.837666
[1545]	Train's auc: 0.973525	Evaluation's auc: 0.837664
[1546]	Train's auc: 0.973537	Evaluation's auc: 0.837662
[1547]	Train's auc: 0.973553	Evaluation's auc: 0.837619
[1548]	Train's auc: 0.973568	Evaluation's auc: 0.837619
[1549]	Train's auc: 0.973586	Evaluation's auc: 0.837628
[1550]	Train's auc: 0.973603	Evaluation's auc: 0.837619
[1551]	Train's auc: 0.973632	Evaluation's auc: 0.837608
[1552]	Train's auc: 0.973645	Evaluation's auc: 0.837607
[1553]	Train's auc: 0.973666	Evaluation's auc: 0.837586
[1554]	Train's auc: 0.973679	Evaluation's auc: 0.837618
[1555]	Train's auc: 0.973682	Evaluation's auc: 0.837609
[1556]	Train's auc: 0.973725	Evaluation's auc: 0.837631
[1557]	Train's auc: 0.973754	Evaluation's auc: 0.837584
[1558]	Train's auc: 0.973775	Evaluation's auc: 0.837582
[1559]	Train's auc: 0.973792	Evaluation's auc: 0.837562
[1560]	Train's auc: 0.973816	Evaluation's auc: 0.83755
[1561]	Train's auc: 0.973829	Evaluation's auc: 0.83754
[1562]	Train's auc: 0.973868	Evaluation's auc: 0.837534
[1563]	Train's auc: 0.973876	Evaluation's auc: 0.837527
[1564]	Train's auc: 0.973888	Evaluation's auc: 0.837519
[1565]	Train's auc: 0.973911	Evaluation's auc: 0.837474
[1566]	Train's auc: 0.973935	Evaluation's auc: 0.837467
[1567]	Train's auc: 0.973937	Evaluation's auc: 0.837464
[1568]	Train's auc: 0.973976	Evaluation's auc: 0.837468
[1569]	Train's auc: 0.973997	Evaluation's auc: 0.837464
[1570]	Train's auc: 0.973998	Evaluation's auc: 0.837469
[1571]	Train's auc: 0.974017	Evaluation's auc: 0.837459
[1572]	Train's auc: 0.974025	Evaluation's auc: 0.837457
[1573]	Train's auc: 0.974062	Evaluation's auc: 0.837449
[1574]	Train's auc: 0.974074	Evaluation's auc: 0.837458
[1575]	Train's auc: 0.974091	Evaluation's auc: 0.837466
[1576]	Train's auc: 0.974109	Evaluation's auc: 0.83743
[1577]	Train's auc: 0.974134	Evaluation's auc: 0.837467
[1578]	Train's auc: 0.974161	Evaluation's auc: 0.837475
[1579]	Train's auc: 0.974185	Evaluation's auc: 0.837463
[1580]	Train's auc: 0.974193	Evaluation's auc: 0.837465
[1581]	Train's auc: 0.974213	Evaluation's auc: 0.837416
[1582]	Train's auc: 0.974233	Evaluation's auc: 0.837392
[1583]	Train's auc: 0.974265	Evaluation's auc: 0.8374
[1584]	Train's auc: 0.974292	Evaluation's auc: 0.837381
[1585]	Train's auc: 0.974303	Evaluation's auc: 0.837369
[1586]	Train's auc: 0.974328	Evaluation's auc: 0.837362
[1587]	Train's auc: 0.974356	Evaluation's auc: 0.837391
[1588]	Train's auc: 0.97436	Evaluation's auc: 0.837404
[1589]	Train's auc: 0.974381	Evaluation's auc: 0.837414
[1590]	Train's auc: 0.974391	Evaluation's auc: 0.837412
[1591]	Train's auc: 0.974397	Evaluation's auc: 0.837394
[1592]	Train's auc: 0.974405	Evaluation's auc: 0.837399
[1593]	Train's auc: 0.974422	Evaluation's auc: 0.837386
[1594]	Train's auc: 0.974427	Evaluation's auc: 0.837424
[1595]	Train's auc: 0.974442	Evaluation's auc: 0.837443
[1596]	Train's auc: 0.974471	Evaluation's auc: 0.8374
[1597]	Train's auc: 0.974481	Evaluation's auc: 0.837427
[1598]	Train's auc: 0.974494	Evaluation's auc: 0.837432
[1599]	Train's auc: 0.974524	Evaluation's auc: 0.837427
[1600]	Train's auc: 0.974546	Evaluation's auc: 0.837439
[1601]	Train's auc: 0.974568	Evaluation's auc: 0.83743
[1602]	Train's auc: 0.974578	Evaluation's auc: 0.837446
[1603]	Train's auc: 0.974605	Evaluation's auc: 0.837424
[1604]	Train's auc: 0.974623	Evaluation's auc: 0.837425
[1605]	Train's auc: 0.974634	Evaluation's auc: 0.837441
[1606]	Train's auc: 0.974647	Evaluation's auc: 0.837441
[1607]	Train's auc: 0.974653	Evaluation's auc: 0.837434
[1608]	Train's auc: 0.97468	Evaluation's auc: 0.837461
[1609]	Train's auc: 0.974698	Evaluation's auc: 0.837458
[1610]	Train's auc: 0.974737	Evaluation's auc: 0.837473
[1611]	Train's auc: 0.974756	Evaluation's auc: 0.837424
[1612]	Train's auc: 0.974759	Evaluation's auc: 0.837429
[1613]	Train's auc: 0.974779	Evaluation's auc: 0.837468
[1614]	Train's auc: 0.974787	Evaluation's auc: 0.837485
[1615]	Train's auc: 0.974813	Evaluation's auc: 0.837505
[1616]	Train's auc: 0.974852	Evaluation's auc: 0.837488
[1617]	Train's auc: 0.974859	Evaluation's auc: 0.837465
[1618]	Train's auc: 0.97488	Evaluation's auc: 0.837446
[1619]	Train's auc: 0.9749	Evaluation's auc: 0.837456
[1620]	Train's auc: 0.974913	Evaluation's auc: 0.837454
[1621]	Train's auc: 0.974933	Evaluation's auc: 0.83742
[1622]	Train's auc: 0.974941	Evaluation's auc: 0.837414
[1623]	Train's auc: 0.974964	Evaluation's auc: 0.837432
[1624]	Train's auc: 0.974969	Evaluation's auc: 0.837439
[1625]	Train's auc: 0.974996	Evaluation's auc: 0.837411
[1626]	Train's auc: 0.97501	Evaluation's auc: 0.837436
[1627]	Train's auc: 0.975035	Evaluation's auc: 0.837439
[1628]	Train's auc: 0.975053	Evaluation's auc: 0.837464
[1629]	Train's auc: 0.975059	Evaluation's auc: 0.837456
[1630]	Train's auc: 0.97507	Evaluation's auc: 0.837441
[1631]	Train's auc: 0.975109	Evaluation's auc: 0.83739
[1632]	Train's auc: 0.975119	Evaluation's auc: 0.837369
[1633]	Train's auc: 0.975156	Evaluation's auc: 0.837306
[1634]	Train's auc: 0.975191	Evaluation's auc: 0.837293
[1635]	Train's auc: 0.97521	Evaluation's auc: 0.837286
[1636]	Train's auc: 0.975234	Evaluation's auc: 0.837292
[1637]	Train's auc: 0.975253	Evaluation's auc: 0.83732
[1638]	Train's auc: 0.975258	Evaluation's auc: 0.837314
[1639]	Train's auc: 0.975293	Evaluation's auc: 0.83733
[1640]	Train's auc: 0.975309	Evaluation's auc: 0.837313
[1641]	Train's auc: 0.975342	Evaluation's auc: 0.837326
[1642]	Train's auc: 0.975377	Evaluation's auc: 0.837313
[1643]	Train's auc: 0.975402	Evaluation's auc: 0.83731
[1644]	Train's auc: 0.975422	Evaluation's auc: 0.837318
[1645]	Train's auc: 0.975441	Evaluation's auc: 0.837306
[1646]	Train's auc: 0.975438	Evaluation's auc: 0.837311
[1647]	Train's auc: 0.975452	Evaluation's auc: 0.837307
[1648]	Train's auc: 0.975457	Evaluation's auc: 0.837305
[1649]	Train's auc: 0.975465	Evaluation's auc: 0.837307
[1650]	Train's auc: 0.975471	Evaluation's auc: 0.837344
[1651]	Train's auc: 0.975476	Evaluation's auc: 0.837336
[1652]	Train's auc: 0.975482	Evaluation's auc: 0.837316
[1653]	Train's auc: 0.975494	Evaluation's auc: 0.837309
[1654]	Train's auc: 0.97553	Evaluation's auc: 0.837312
[1655]	Train's auc: 0.975545	Evaluation's auc: 0.837299
[1656]	Train's auc: 0.975558	Evaluation's auc: 0.837273
[1657]	Train's auc: 0.97557	Evaluation's auc: 0.837271
[1658]	Train's auc: 0.975594	Evaluation's auc: 0.837213
[1659]	Train's auc: 0.975612	Evaluation's auc: 0.837193
[1660]	Train's auc: 0.975628	Evaluation's auc: 0.837189
[1661]	Train's auc: 0.975643	Evaluation's auc: 0.837191
[1662]	Train's auc: 0.975665	Evaluation's auc: 0.837161
[1663]	Train's auc: 0.975704	Evaluation's auc: 0.837181
[1664]	Train's auc: 0.975716	Evaluation's auc: 0.837158
[1665]	Train's auc: 0.97573	Evaluation's auc: 0.837175
[1666]	Train's auc: 0.975755	Evaluation's auc: 0.837169
[1667]	Train's auc: 0.97577	Evaluation's auc: 0.837128
[1668]	Train's auc: 0.975782	Evaluation's auc: 0.837123
[1669]	Train's auc: 0.975787	Evaluation's auc: 0.83713
[1670]	Train's auc: 0.975806	Evaluation's auc: 0.837138
[1671]	Train's auc: 0.975816	Evaluation's auc: 0.837161
[1672]	Train's auc: 0.975829	Evaluation's auc: 0.837166
[1673]	Train's auc: 0.975841	Evaluation's auc: 0.837135
[1674]	Train's auc: 0.975836	Evaluation's auc: 0.837141
[1675]	Train's auc: 0.97585	Evaluation's auc: 0.837127
[1676]	Train's auc: 0.975867	Evaluation's auc: 0.837109
[1677]	Train's auc: 0.975882	Evaluation's auc: 0.837118
[1678]	Train's auc: 0.97591	Evaluation's auc: 0.837091
[1679]	Train's auc: 0.975918	Evaluation's auc: 0.837096
[1680]	Train's auc: 0.975929	Evaluation's auc: 0.83714
[1681]	Train's auc: 0.975959	Evaluation's auc: 0.837146
[1682]	Train's auc: 0.975976	Evaluation's auc: 0.837137
[1683]	Train's auc: 0.975983	Evaluation's auc: 0.837139
[1684]	Train's auc: 0.97599	Evaluation's auc: 0.83714
[1685]	Train's auc: 0.97601	Evaluation's auc: 0.837105
[1686]	Train's auc: 0.976032	Evaluation's auc: 0.837071
[1687]	Train's auc: 0.976056	Evaluation's auc: 0.837043
[1688]	Train's auc: 0.976075	Evaluation's auc: 0.837082
[1689]	Train's auc: 0.976081	Evaluation's auc: 0.837071
[1690]	Train's auc: 0.976094	Evaluation's auc: 0.837044
[1691]	Train's auc: 0.976099	Evaluation's auc: 0.83704
[1692]	Train's auc: 0.976127	Evaluation's auc: 0.837024
[1693]	Train's auc: 0.976145	Evaluation's auc: 0.837022
[1694]	Train's auc: 0.97616	Evaluation's auc: 0.836989
[1695]	Train's auc: 0.976163	Evaluation's auc: 0.836995
[1696]	Train's auc: 0.976169	Evaluation's auc: 0.837006
[1697]	Train's auc: 0.976191	Evaluation's auc: 0.836965
[1698]	Train's auc: 0.976226	Evaluation's auc: 0.836957
[1699]	Train's auc: 0.976239	Evaluation's auc: 0.83696
[1700]	Train's auc: 0.976276	Evaluation's auc: 0.836956
[1701]	Train's auc: 0.976301	Evaluation's auc: 0.836978
[1702]	Train's auc: 0.976322	Evaluation's auc: 0.836993
[1703]	Train's auc: 0.976343	Evaluation's auc: 0.836975
[1704]	Train's auc: 0.976371	Evaluation's auc: 0.836977
[1705]	Train's auc: 0.976384	Evaluation's auc: 0.836931
[1706]	Train's auc: 0.97639	Evaluation's auc: 0.836919
[1707]	Train's auc: 0.976414	Evaluation's auc: 0.836923
[1708]	Train's auc: 0.976426	Evaluation's auc: 0.836918
[1709]	Train's auc: 0.976438	Evaluation's auc: 0.836928
[1710]	Train's auc: 0.976457	Evaluation's auc: 0.83687
[1711]	Train's auc: 0.976477	Evaluation's auc: 0.836858
[1712]	Train's auc: 0.976479	Evaluation's auc: 0.836853
[1713]	Train's auc: 0.976478	Evaluation's auc: 0.836852
[1714]	Train's auc: 0.976477	Evaluation's auc: 0.836854
[1715]	Train's auc: 0.976497	Evaluation's auc: 0.836911
[1716]	Train's auc: 0.976507	Evaluation's auc: 0.836896
[1717]	Train's auc: 0.97652	Evaluation's auc: 0.836891
[1718]	Train's auc: 0.976539	Evaluation's auc: 0.836869
[1719]	Train's auc: 0.976545	Evaluation's auc: 0.836878
[1720]	Train's auc: 0.976557	Evaluation's auc: 0.836929
[1721]	Train's auc: 0.976587	Evaluation's auc: 0.836907
[1722]	Train's auc: 0.97661	Evaluation's auc: 0.836914
[1723]	Train's auc: 0.976638	Evaluation's auc: 0.8369
[1724]	Train's auc: 0.976653	Evaluation's auc: 0.836915
[1725]	Train's auc: 0.976662	Evaluation's auc: 0.836907
[1726]	Train's auc: 0.976668	Evaluation's auc: 0.836902
[1727]	Train's auc: 0.976681	Evaluation's auc: 0.836907
[1728]	Train's auc: 0.976693	Evaluation's auc: 0.836916
[1729]	Train's auc: 0.976709	Evaluation's auc: 0.836907
[1730]	Train's auc: 0.976733	Evaluation's auc: 0.836905
[1731]	Train's auc: 0.976752	Evaluation's auc: 0.836911
[1732]	Train's auc: 0.976772	Evaluation's auc: 0.836923
[1733]	Train's auc: 0.976778	Evaluation's auc: 0.836913
[1734]	Train's auc: 0.976785	Evaluation's auc: 0.836888
[1735]	Train's auc: 0.976799	Evaluation's auc: 0.83689
[1736]	Train's auc: 0.976804	Evaluation's auc: 0.836878
[1737]	Train's auc: 0.976828	Evaluation's auc: 0.836859
[1738]	Train's auc: 0.97683	Evaluation's auc: 0.836858
[1739]	Train's auc: 0.976845	Evaluation's auc: 0.836862
[1740]	Train's auc: 0.976848	Evaluation's auc: 0.836864
[1741]	Train's auc: 0.976863	Evaluation's auc: 0.836855
[1742]	Train's auc: 0.97688	Evaluation's auc: 0.836855
[1743]	Train's auc: 0.976897	Evaluation's auc: 0.836864
[1744]	Train's auc: 0.976907	Evaluation's auc: 0.836856
[1745]	Train's auc: 0.976943	Evaluation's auc: 0.836807
[1746]	Train's auc: 0.97696	Evaluation's auc: 0.836788
[1747]	Train's auc: 0.976958	Evaluation's auc: 0.836786
[1748]	Train's auc: 0.976984	Evaluation's auc: 0.836793
[1749]	Train's auc: 0.976997	Evaluation's auc: 0.836794
[1750]	Train's auc: 0.977017	Evaluation's auc: 0.836814
[1751]	Train's auc: 0.977033	Evaluation's auc: 0.836836
[1752]	Train's auc: 0.977059	Evaluation's auc: 0.83682
[1753]	Train's auc: 0.977073	Evaluation's auc: 0.836793
[1754]	Train's auc: 0.977088	Evaluation's auc: 0.83679
[1755]	Train's auc: 0.977105	Evaluation's auc: 0.836763
[1756]	Train's auc: 0.977125	Evaluation's auc: 0.836751
[1757]	Train's auc: 0.977143	Evaluation's auc: 0.836749
[1758]	Train's auc: 0.977163	Evaluation's auc: 0.836769
[1759]	Train's auc: 0.977177	Evaluation's auc: 0.836711
[1760]	Train's auc: 0.9772	Evaluation's auc: 0.836704
[1761]	Train's auc: 0.97721	Evaluation's auc: 0.836734
[1762]	Train's auc: 0.977222	Evaluation's auc: 0.836731
[1763]	Train's auc: 0.977244	Evaluation's auc: 0.836703
[1764]	Train's auc: 0.977274	Evaluation's auc: 0.836723
[1765]	Train's auc: 0.977289	Evaluation's auc: 0.836687
[1766]	Train's auc: 0.977305	Evaluation's auc: 0.836667
[1767]	Train's auc: 0.977318	Evaluation's auc: 0.836654
[1768]	Train's auc: 0.977329	Evaluation's auc: 0.836648
[1769]	Train's auc: 0.977345	Evaluation's auc: 0.836651
[1770]	Train's auc: 0.977347	Evaluation's auc: 0.836652
[1771]	Train's auc: 0.97736	Evaluation's auc: 0.836618
[1772]	Train's auc: 0.977371	Evaluation's auc: 0.836603
[1773]	Train's auc: 0.977377	Evaluation's auc: 0.836606
[1774]	Train's auc: 0.977391	Evaluation's auc: 0.836595
[1775]	Train's auc: 0.977413	Evaluation's auc: 0.836593
[1776]	Train's auc: 0.977428	Evaluation's auc: 0.836593
[1777]	Train's auc: 0.977459	Evaluation's auc: 0.836502
[1778]	Train's auc: 0.97748	Evaluation's auc: 0.836471
[1779]	Train's auc: 0.97748	Evaluation's auc: 0.836491
[1780]	Train's auc: 0.977502	Evaluation's auc: 0.836523
[1781]	Train's auc: 0.977512	Evaluation's auc: 0.836501
[1782]	Train's auc: 0.977511	Evaluation's auc: 0.83651
[1783]	Train's auc: 0.977526	Evaluation's auc: 0.83648
[1784]	Train's auc: 0.977534	Evaluation's auc: 0.83648
[1785]	Train's auc: 0.977535	Evaluation's auc: 0.836479
[1786]	Train's auc: 0.977547	Evaluation's auc: 0.836484
[1787]	Train's auc: 0.977557	Evaluation's auc: 0.836509
[1788]	Train's auc: 0.977585	Evaluation's auc: 0.836529
[1789]	Train's auc: 0.977607	Evaluation's auc: 0.8365
[1790]	Train's auc: 0.977618	Evaluation's auc: 0.836464
[1791]	Train's auc: 0.977631	Evaluation's auc: 0.836456
[1792]	Train's auc: 0.97765	Evaluation's auc: 0.836437
...	(per-iteration log truncated)	...
[2437]	Train's auc: 0.98393	Evaluation's auc: 0.834546

(Over iterations 1792-2437 the training AUC climbed steadily from about 0.978 to 0.984 while the evaluation AUC peaked at 0.836514 around iteration 1829 and then drifted down to 0.8345: the boosting run is past its best evaluation score and is overfitting the training data.)
[2438]	Train's auc: 0.98393	Evaluation's auc: 0.834544
[2439]	Train's auc: 0.98394	Evaluation's auc: 0.83455
[2440]	Train's auc: 0.983956	Evaluation's auc: 0.834537
[2441]	Train's auc: 0.983958	Evaluation's auc: 0.834557
[2442]	Train's auc: 0.983965	Evaluation's auc: 0.834554
[2443]	Train's auc: 0.983973	Evaluation's auc: 0.834527
[2444]	Train's auc: 0.983977	Evaluation's auc: 0.83452
[2445]	Train's auc: 0.983984	Evaluation's auc: 0.834503
[2446]	Train's auc: 0.983988	Evaluation's auc: 0.834498
[2447]	Train's auc: 0.983999	Evaluation's auc: 0.834493
[2448]	Train's auc: 0.984003	Evaluation's auc: 0.834501
[2449]	Train's auc: 0.984013	Evaluation's auc: 0.834498
[2450]	Train's auc: 0.984015	Evaluation's auc: 0.834482
[2451]	Train's auc: 0.984025	Evaluation's auc: 0.834469
[2452]	Train's auc: 0.984035	Evaluation's auc: 0.834486
[2453]	Train's auc: 0.984053	Evaluation's auc: 0.834452
[2454]	Train's auc: 0.984061	Evaluation's auc: 0.834466
[2455]	Train's auc: 0.984061	Evaluation's auc: 0.834489
[2456]	Train's auc: 0.984064	Evaluation's auc: 0.834465
[2457]	Train's auc: 0.984073	Evaluation's auc: 0.834518
[2458]	Train's auc: 0.984079	Evaluation's auc: 0.83456
[2459]	Train's auc: 0.984092	Evaluation's auc: 0.834557
[2460]	Train's auc: 0.984103	Evaluation's auc: 0.834562
[2461]	Train's auc: 0.984105	Evaluation's auc: 0.834558
[2462]	Train's auc: 0.98411	Evaluation's auc: 0.834558
[2463]	Train's auc: 0.984113	Evaluation's auc: 0.834574
[2464]	Train's auc: 0.984122	Evaluation's auc: 0.834578
[2465]	Train's auc: 0.984119	Evaluation's auc: 0.834592
[2466]	Train's auc: 0.984133	Evaluation's auc: 0.834572
[2467]	Train's auc: 0.984139	Evaluation's auc: 0.834563
[2468]	Train's auc: 0.984151	Evaluation's auc: 0.834567
[2469]	Train's auc: 0.984155	Evaluation's auc: 0.834594
[2470]	Train's auc: 0.984162	Evaluation's auc: 0.834594
[2471]	Train's auc: 0.984175	Evaluation's auc: 0.834565
[2472]	Train's auc: 0.984181	Evaluation's auc: 0.83457
[2473]	Train's auc: 0.984189	Evaluation's auc: 0.834556
[2474]	Train's auc: 0.984194	Evaluation's auc: 0.834535
[2475]	Train's auc: 0.984201	Evaluation's auc: 0.834538
[2476]	Train's auc: 0.98421	Evaluation's auc: 0.834535
[2477]	Train's auc: 0.984211	Evaluation's auc: 0.834535
[2478]	Train's auc: 0.984218	Evaluation's auc: 0.834523
[2479]	Train's auc: 0.98423	Evaluation's auc: 0.834493
[2480]	Train's auc: 0.984232	Evaluation's auc: 0.834512
[2481]	Train's auc: 0.984241	Evaluation's auc: 0.834513
[2482]	Train's auc: 0.984245	Evaluation's auc: 0.834502
[2483]	Train's auc: 0.984244	Evaluation's auc: 0.834507
[2484]	Train's auc: 0.984258	Evaluation's auc: 0.834474
[2485]	Train's auc: 0.984261	Evaluation's auc: 0.834502
[2486]	Train's auc: 0.984263	Evaluation's auc: 0.834474
[2487]	Train's auc: 0.984262	Evaluation's auc: 0.834485
[2488]	Train's auc: 0.984271	Evaluation's auc: 0.834496
[2489]	Train's auc: 0.984272	Evaluation's auc: 0.834493
[2490]	Train's auc: 0.984278	Evaluation's auc: 0.834473
[2491]	Train's auc: 0.984282	Evaluation's auc: 0.834462
[2492]	Train's auc: 0.984289	Evaluation's auc: 0.834466
[2493]	Train's auc: 0.984305	Evaluation's auc: 0.834561
[2494]	Train's auc: 0.984309	Evaluation's auc: 0.834569
[2495]	Train's auc: 0.984318	Evaluation's auc: 0.834566
[2496]	Train's auc: 0.984327	Evaluation's auc: 0.834574
[2497]	Train's auc: 0.984333	Evaluation's auc: 0.834548
[2498]	Train's auc: 0.984333	Evaluation's auc: 0.834549
[2499]	Train's auc: 0.984341	Evaluation's auc: 0.834547
[2500]	Train's auc: 0.984348	Evaluation's auc: 0.834527
[2501]	Train's auc: 0.984356	Evaluation's auc: 0.834511
[2502]	Train's auc: 0.984367	Evaluation's auc: 0.834516
[2503]	Train's auc: 0.98437	Evaluation's auc: 0.834502
[2504]	Train's auc: 0.984371	Evaluation's auc: 0.834495
[2505]	Train's auc: 0.984385	Evaluation's auc: 0.834459
[2506]	Train's auc: 0.98439	Evaluation's auc: 0.834463
[2507]	Train's auc: 0.984389	Evaluation's auc: 0.834457
[2508]	Train's auc: 0.98439	Evaluation's auc: 0.834466
[2509]	Train's auc: 0.984402	Evaluation's auc: 0.834474
[2510]	Train's auc: 0.984408	Evaluation's auc: 0.834514
[2511]	Train's auc: 0.984413	Evaluation's auc: 0.834504
[2512]	Train's auc: 0.984418	Evaluation's auc: 0.834508
[2513]	Train's auc: 0.984421	Evaluation's auc: 0.834518
[2514]	Train's auc: 0.984426	Evaluation's auc: 0.834497
[2515]	Train's auc: 0.984433	Evaluation's auc: 0.834489
[2516]	Train's auc: 0.984443	Evaluation's auc: 0.834501
[2517]	Train's auc: 0.984446	Evaluation's auc: 0.834525
[2518]	Train's auc: 0.984454	Evaluation's auc: 0.834497
[2519]	Train's auc: 0.984464	Evaluation's auc: 0.8345
[2520]	Train's auc: 0.984481	Evaluation's auc: 0.834488
[2521]	Train's auc: 0.984487	Evaluation's auc: 0.834489
[2522]	Train's auc: 0.984494	Evaluation's auc: 0.834474
[2523]	Train's auc: 0.984488	Evaluation's auc: 0.834462
[2524]	Train's auc: 0.984495	Evaluation's auc: 0.834458
[2525]	Train's auc: 0.984504	Evaluation's auc: 0.834441
[2526]	Train's auc: 0.984519	Evaluation's auc: 0.834435
[2527]	Train's auc: 0.984528	Evaluation's auc: 0.834449
[2528]	Train's auc: 0.984532	Evaluation's auc: 0.834463
[2529]	Train's auc: 0.984542	Evaluation's auc: 0.834459
[2530]	Train's auc: 0.984545	Evaluation's auc: 0.83444
[2531]	Train's auc: 0.984555	Evaluation's auc: 0.834436
[2532]	Train's auc: 0.984557	Evaluation's auc: 0.834443
[2533]	Train's auc: 0.984563	Evaluation's auc: 0.834446
[2534]	Train's auc: 0.984563	Evaluation's auc: 0.834442
[2535]	Train's auc: 0.984578	Evaluation's auc: 0.834433
[2536]	Train's auc: 0.984584	Evaluation's auc: 0.834433
[2537]	Train's auc: 0.984591	Evaluation's auc: 0.834411
[2538]	Train's auc: 0.984597	Evaluation's auc: 0.834429
[2539]	Train's auc: 0.984598	Evaluation's auc: 0.834424
[2540]	Train's auc: 0.984607	Evaluation's auc: 0.834396
[2541]	Train's auc: 0.984608	Evaluation's auc: 0.834374
[2542]	Train's auc: 0.98462	Evaluation's auc: 0.834362
[2543]	Train's auc: 0.98465	Evaluation's auc: 0.834382
[2544]	Train's auc: 0.984656	Evaluation's auc: 0.834388
[2545]	Train's auc: 0.984669	Evaluation's auc: 0.834399
[2546]	Train's auc: 0.984675	Evaluation's auc: 0.8344
[2547]	Train's auc: 0.984687	Evaluation's auc: 0.834409
[2548]	Train's auc: 0.984688	Evaluation's auc: 0.834437
[2549]	Train's auc: 0.984674	Evaluation's auc: 0.834459
[2550]	Train's auc: 0.984682	Evaluation's auc: 0.834467
[2551]	Train's auc: 0.98469	Evaluation's auc: 0.834468
[2552]	Train's auc: 0.984697	Evaluation's auc: 0.834461
[2553]	Train's auc: 0.984701	Evaluation's auc: 0.834458
[2554]	Train's auc: 0.984718	Evaluation's auc: 0.834444
[2555]	Train's auc: 0.984735	Evaluation's auc: 0.83443
[2556]	Train's auc: 0.984737	Evaluation's auc: 0.834431
[2557]	Train's auc: 0.984752	Evaluation's auc: 0.834442
[2558]	Train's auc: 0.984755	Evaluation's auc: 0.834431
[2559]	Train's auc: 0.984771	Evaluation's auc: 0.834428
[2560]	Train's auc: 0.984781	Evaluation's auc: 0.834434
[2561]	Train's auc: 0.98478	Evaluation's auc: 0.834437
[2562]	Train's auc: 0.984794	Evaluation's auc: 0.834433
[2563]	Train's auc: 0.9848	Evaluation's auc: 0.834428
[2564]	Train's auc: 0.984804	Evaluation's auc: 0.834419
[2565]	Train's auc: 0.984799	Evaluation's auc: 0.834422
[2566]	Train's auc: 0.984812	Evaluation's auc: 0.834398
[2567]	Train's auc: 0.984824	Evaluation's auc: 0.834411
[2568]	Train's auc: 0.98483	Evaluation's auc: 0.834436
[2569]	Train's auc: 0.984832	Evaluation's auc: 0.834414
[2570]	Train's auc: 0.984843	Evaluation's auc: 0.834436
[2571]	Train's auc: 0.984849	Evaluation's auc: 0.834434
[2572]	Train's auc: 0.984851	Evaluation's auc: 0.834442
[2573]	Train's auc: 0.984876	Evaluation's auc: 0.834424
[2574]	Train's auc: 0.984881	Evaluation's auc: 0.83441
[2575]	Train's auc: 0.984877	Evaluation's auc: 0.834451
[2576]	Train's auc: 0.984895	Evaluation's auc: 0.834457
[2577]	Train's auc: 0.984899	Evaluation's auc: 0.834462
[2578]	Train's auc: 0.984904	Evaluation's auc: 0.834441
[2579]	Train's auc: 0.984904	Evaluation's auc: 0.834428
[2580]	Train's auc: 0.98491	Evaluation's auc: 0.834417
[2581]	Train's auc: 0.984917	Evaluation's auc: 0.834425
[2582]	Train's auc: 0.98492	Evaluation's auc: 0.834442
[2583]	Train's auc: 0.984926	Evaluation's auc: 0.834436
[2584]	Train's auc: 0.984927	Evaluation's auc: 0.834435
[2585]	Train's auc: 0.984933	Evaluation's auc: 0.834419
[2586]	Train's auc: 0.984933	Evaluation's auc: 0.834422
[2587]	Train's auc: 0.984939	Evaluation's auc: 0.834424
[2588]	Train's auc: 0.984941	Evaluation's auc: 0.834414
[2589]	Train's auc: 0.984946	Evaluation's auc: 0.834384
[2590]	Train's auc: 0.984951	Evaluation's auc: 0.83441
[2591]	Train's auc: 0.984976	Evaluation's auc: 0.834371
[2592]	Train's auc: 0.984975	Evaluation's auc: 0.834377
[2593]	Train's auc: 0.984981	Evaluation's auc: 0.834365
[2594]	Train's auc: 0.984989	Evaluation's auc: 0.834357
[2595]	Train's auc: 0.984999	Evaluation's auc: 0.834326
[2596]	Train's auc: 0.984999	Evaluation's auc: 0.834328
[2597]	Train's auc: 0.985002	Evaluation's auc: 0.834309
[2598]	Train's auc: 0.985008	Evaluation's auc: 0.834304
[2599]	Train's auc: 0.985012	Evaluation's auc: 0.83429
[2600]	Train's auc: 0.985015	Evaluation's auc: 0.834294
[2601]	Train's auc: 0.985021	Evaluation's auc: 0.834303
[2602]	Train's auc: 0.985026	Evaluation's auc: 0.834308
[2603]	Train's auc: 0.985028	Evaluation's auc: 0.834289
[2604]	Train's auc: 0.985037	Evaluation's auc: 0.834307
[2605]	Train's auc: 0.985043	Evaluation's auc: 0.83428
[2606]	Train's auc: 0.985054	Evaluation's auc: 0.834243
[2607]	Train's auc: 0.98506	Evaluation's auc: 0.834226
[2608]	Train's auc: 0.985064	Evaluation's auc: 0.834216
[2609]	Train's auc: 0.98507	Evaluation's auc: 0.834232
[2610]	Train's auc: 0.985076	Evaluation's auc: 0.834221
[2611]	Train's auc: 0.985081	Evaluation's auc: 0.834224
[2612]	Train's auc: 0.985081	Evaluation's auc: 0.834246
[2613]	Train's auc: 0.985091	Evaluation's auc: 0.834242
[2614]	Train's auc: 0.985103	Evaluation's auc: 0.83424
[2615]	Train's auc: 0.98511	Evaluation's auc: 0.834242
[2616]	Train's auc: 0.98512	Evaluation's auc: 0.834229
[2617]	Train's auc: 0.985136	Evaluation's auc: 0.834245
[2618]	Train's auc: 0.985141	Evaluation's auc: 0.834243
[2619]	Train's auc: 0.98512	Evaluation's auc: 0.834263
[2620]	Train's auc: 0.985127	Evaluation's auc: 0.834246
[2621]	Train's auc: 0.98514	Evaluation's auc: 0.83425
[2622]	Train's auc: 0.985148	Evaluation's auc: 0.834242
[2623]	Train's auc: 0.985149	Evaluation's auc: 0.834234
[2624]	Train's auc: 0.985154	Evaluation's auc: 0.834239
[2625]	Train's auc: 0.985165	Evaluation's auc: 0.834197
[2626]	Train's auc: 0.985157	Evaluation's auc: 0.83425
[2627]	Train's auc: 0.985163	Evaluation's auc: 0.834245
[2628]	Train's auc: 0.985163	Evaluation's auc: 0.834245
[2629]	Train's auc: 0.98517	Evaluation's auc: 0.834258
[2630]	Train's auc: 0.98517	Evaluation's auc: 0.834258
[2631]	Train's auc: 0.985174	Evaluation's auc: 0.83426
[2632]	Train's auc: 0.985183	Evaluation's auc: 0.834243
[2633]	Train's auc: 0.985191	Evaluation's auc: 0.834253
[2634]	Train's auc: 0.985194	Evaluation's auc: 0.83426
[2635]	Train's auc: 0.985201	Evaluation's auc: 0.834265
[2636]	Train's auc: 0.985207	Evaluation's auc: 0.834291
[2637]	Train's auc: 0.985218	Evaluation's auc: 0.83427
[2638]	Train's auc: 0.985218	Evaluation's auc: 0.83427
[2639]	Train's auc: 0.985225	Evaluation's auc: 0.83427
[2640]	Train's auc: 0.985237	Evaluation's auc: 0.834232
[2641]	Train's auc: 0.985239	Evaluation's auc: 0.834226
[2642]	Train's auc: 0.985234	Evaluation's auc: 0.83424
[2643]	Train's auc: 0.985241	Evaluation's auc: 0.834237
[2644]	Train's auc: 0.985247	Evaluation's auc: 0.834252
[2645]	Train's auc: 0.985252	Evaluation's auc: 0.83428
[2646]	Train's auc: 0.98526	Evaluation's auc: 0.834261
[2647]	Train's auc: 0.98528	Evaluation's auc: 0.834242
[2648]	Train's auc: 0.985296	Evaluation's auc: 0.83422
[2649]	Train's auc: 0.985309	Evaluation's auc: 0.834228
[2650]	Train's auc: 0.985313	Evaluation's auc: 0.834216
[2651]	Train's auc: 0.985314	Evaluation's auc: 0.834215
[2652]	Train's auc: 0.985319	Evaluation's auc: 0.834215
[2653]	Train's auc: 0.985319	Evaluation's auc: 0.83421
[2654]	Train's auc: 0.985321	Evaluation's auc: 0.834203
[2655]	Train's auc: 0.985327	Evaluation's auc: 0.834187
[2656]	Train's auc: 0.985334	Evaluation's auc: 0.834191
[2657]	Train's auc: 0.98534	Evaluation's auc: 0.83417
[2658]	Train's auc: 0.985339	Evaluation's auc: 0.834169
[2659]	Train's auc: 0.985343	Evaluation's auc: 0.834166
[2660]	Train's auc: 0.985345	Evaluation's auc: 0.83418
[2661]	Train's auc: 0.985348	Evaluation's auc: 0.834173
[2662]	Train's auc: 0.985343	Evaluation's auc: 0.834191
[2663]	Train's auc: 0.985347	Evaluation's auc: 0.834204
[2664]	Train's auc: 0.985353	Evaluation's auc: 0.834204
[2665]	Train's auc: 0.985359	Evaluation's auc: 0.834202
[2666]	Train's auc: 0.98536	Evaluation's auc: 0.834196
[2667]	Train's auc: 0.985367	Evaluation's auc: 0.83418
[2668]	Train's auc: 0.985364	Evaluation's auc: 0.834157
[2669]	Train's auc: 0.985368	Evaluation's auc: 0.834144
[2670]	Train's auc: 0.985375	Evaluation's auc: 0.834123
[2671]	Train's auc: 0.985384	Evaluation's auc: 0.834101
[2672]	Train's auc: 0.985392	Evaluation's auc: 0.83408
[2673]	Train's auc: 0.985398	Evaluation's auc: 0.834106
[2674]	Train's auc: 0.985395	Evaluation's auc: 0.834082
[2675]	Train's auc: 0.985408	Evaluation's auc: 0.834051
[2676]	Train's auc: 0.985414	Evaluation's auc: 0.83404
[2677]	Train's auc: 0.98542	Evaluation's auc: 0.834033
[2678]	Train's auc: 0.985423	Evaluation's auc: 0.83402
[2679]	Train's auc: 0.985439	Evaluation's auc: 0.833986
[2680]	Train's auc: 0.985441	Evaluation's auc: 0.833976
[2681]	Train's auc: 0.985448	Evaluation's auc: 0.833985
[2682]	Train's auc: 0.985453	Evaluation's auc: 0.833981
[2683]	Train's auc: 0.985457	Evaluation's auc: 0.83397
[2684]	Train's auc: 0.985458	Evaluation's auc: 0.833964
[2685]	Train's auc: 0.985459	Evaluation's auc: 0.833962
[2686]	Train's auc: 0.985462	Evaluation's auc: 0.833954
[2687]	Train's auc: 0.985468	Evaluation's auc: 0.833928
[2688]	Train's auc: 0.985476	Evaluation's auc: 0.833944
[2689]	Train's auc: 0.985481	Evaluation's auc: 0.833918
[2690]	Train's auc: 0.985496	Evaluation's auc: 0.833896
[2691]	Train's auc: 0.985498	Evaluation's auc: 0.833916
[2692]	Train's auc: 0.985512	Evaluation's auc: 0.833896
[2693]	Train's auc: 0.985522	Evaluation's auc: 0.833882
[2694]	Train's auc: 0.985531	Evaluation's auc: 0.833865
[2695]	Train's auc: 0.985533	Evaluation's auc: 0.83389
[2696]	Train's auc: 0.985536	Evaluation's auc: 0.833884
[2697]	Train's auc: 0.985539	Evaluation's auc: 0.833885
[2698]	Train's auc: 0.985542	Evaluation's auc: 0.833867
[2699]	Train's auc: 0.985549	Evaluation's auc: 0.833904
[2700]	Train's auc: 0.985556	Evaluation's auc: 0.833921
[2701]	Train's auc: 0.985564	Evaluation's auc: 0.833967
[2702]	Train's auc: 0.985571	Evaluation's auc: 0.833948
[2703]	Train's auc: 0.98558	Evaluation's auc: 0.833967
[2704]	Train's auc: 0.985591	Evaluation's auc: 0.833978
[2705]	Train's auc: 0.985592	Evaluation's auc: 0.833954
[2706]	Train's auc: 0.985595	Evaluation's auc: 0.833942
[2707]	Train's auc: 0.985604	Evaluation's auc: 0.833953
[2708]	Train's auc: 0.985611	Evaluation's auc: 0.833925
[2709]	Train's auc: 0.985616	Evaluation's auc: 0.833915
[2710]	Train's auc: 0.985624	Evaluation's auc: 0.833906
[2711]	Train's auc: 0.985635	Evaluation's auc: 0.833848
[2712]	Train's auc: 0.985639	Evaluation's auc: 0.833862
[2713]	Train's auc: 0.985644	Evaluation's auc: 0.833834
[2714]	Train's auc: 0.985646	Evaluation's auc: 0.833825
[2715]	Train's auc: 0.985647	Evaluation's auc: 0.833815
[2716]	Train's auc: 0.985649	Evaluation's auc: 0.833793
[2717]	Train's auc: 0.985653	Evaluation's auc: 0.833789
[2718]	Train's auc: 0.985662	Evaluation's auc: 0.833765
[2719]	Train's auc: 0.985664	Evaluation's auc: 0.833753
[2720]	Train's auc: 0.985671	Evaluation's auc: 0.833724
[2721]	Train's auc: 0.985678	Evaluation's auc: 0.833746
[2722]	Train's auc: 0.985681	Evaluation's auc: 0.833751
[2723]	Train's auc: 0.985693	Evaluation's auc: 0.833711
[2724]	Train's auc: 0.985694	Evaluation's auc: 0.833712
[2725]	Train's auc: 0.985694	Evaluation's auc: 0.83371
[2726]	Train's auc: 0.985703	Evaluation's auc: 0.8337
[2727]	Train's auc: 0.985708	Evaluation's auc: 0.833682
[2728]	Train's auc: 0.98571	Evaluation's auc: 0.833681
[2729]	Train's auc: 0.985713	Evaluation's auc: 0.83369
[2730]	Train's auc: 0.985721	Evaluation's auc: 0.833681
[2731]	Train's auc: 0.985722	Evaluation's auc: 0.833691
[2732]	Train's auc: 0.985727	Evaluation's auc: 0.833692
[2733]	Train's auc: 0.985729	Evaluation's auc: 0.833675
[2734]	Train's auc: 0.985731	Evaluation's auc: 0.83367
[2735]	Train's auc: 0.98573	Evaluation's auc: 0.833671
[2736]	Train's auc: 0.985732	Evaluation's auc: 0.833696
[2737]	Train's auc: 0.985741	Evaluation's auc: 0.833673
[2738]	Train's auc: 0.985745	Evaluation's auc: 0.833687
[2739]	Train's auc: 0.985753	Evaluation's auc: 0.833707
[2740]	Train's auc: 0.985757	Evaluation's auc: 0.833706
[2741]	Train's auc: 0.985759	Evaluation's auc: 0.833692
[2742]	Train's auc: 0.985766	Evaluation's auc: 0.83368
[2743]	Train's auc: 0.985752	Evaluation's auc: 0.833702
[2744]	Train's auc: 0.985788	Evaluation's auc: 0.833709
[2745]	Train's auc: 0.985793	Evaluation's auc: 0.833678
[2746]	Train's auc: 0.985798	Evaluation's auc: 0.833669
[2747]	Train's auc: 0.985799	Evaluation's auc: 0.833651
[2748]	Train's auc: 0.985802	Evaluation's auc: 0.83365
[2749]	Train's auc: 0.985807	Evaluation's auc: 0.833652
[2750]	Train's auc: 0.985812	Evaluation's auc: 0.83367
[2751]	Train's auc: 0.985816	Evaluation's auc: 0.833663
[2752]	Train's auc: 0.985817	Evaluation's auc: 0.833657
[2753]	Train's auc: 0.985818	Evaluation's auc: 0.833619
[2754]	Train's auc: 0.985832	Evaluation's auc: 0.833649
[2755]	Train's auc: 0.985838	Evaluation's auc: 0.833618
[2756]	Train's auc: 0.985843	Evaluation's auc: 0.833588
[2757]	Train's auc: 0.985852	Evaluation's auc: 0.833591
[2758]	Train's auc: 0.985859	Evaluation's auc: 0.833577
[2759]	Train's auc: 0.985868	Evaluation's auc: 0.833575
[2760]	Train's auc: 0.985875	Evaluation's auc: 0.833565
[2761]	Train's auc: 0.985887	Evaluation's auc: 0.833572
[2762]	Train's auc: 0.985889	Evaluation's auc: 0.833565
[2763]	Train's auc: 0.985897	Evaluation's auc: 0.83359
[2764]	Train's auc: 0.985905	Evaluation's auc: 0.833579
[2765]	Train's auc: 0.985907	Evaluation's auc: 0.833569
[2766]	Train's auc: 0.985914	Evaluation's auc: 0.833553
[2767]	Train's auc: 0.985912	Evaluation's auc: 0.833553
[2768]	Train's auc: 0.985913	Evaluation's auc: 0.833563
[2769]	Train's auc: 0.985922	Evaluation's auc: 0.833563
[2770]	Train's auc: 0.985923	Evaluation's auc: 0.833555
[2771]	Train's auc: 0.985929	Evaluation's auc: 0.83355
[2772]	Train's auc: 0.985939	Evaluation's auc: 0.83355
[2773]	Train's auc: 0.985936	Evaluation's auc: 0.833551
[2774]	Train's auc: 0.985935	Evaluation's auc: 0.833542
[2775]	Train's auc: 0.985941	Evaluation's auc: 0.833522
[2776]	Train's auc: 0.985944	Evaluation's auc: 0.833528
[2777]	Train's auc: 0.985952	Evaluation's auc: 0.833547
[2778]	Train's auc: 0.985961	Evaluation's auc: 0.833546
[2779]	Train's auc: 0.985964	Evaluation's auc: 0.833547
[2780]	Train's auc: 0.985965	Evaluation's auc: 0.833557
[2781]	Train's auc: 0.985963	Evaluation's auc: 0.833544
[2782]	Train's auc: 0.985967	Evaluation's auc: 0.833541
[2783]	Train's auc: 0.985969	Evaluation's auc: 0.833545
[2784]	Train's auc: 0.985973	Evaluation's auc: 0.833543
[2785]	Train's auc: 0.985978	Evaluation's auc: 0.833515
[2786]	Train's auc: 0.985985	Evaluation's auc: 0.833484
[2787]	Train's auc: 0.985985	Evaluation's auc: 0.833483
[2788]	Train's auc: 0.985999	Evaluation's auc: 0.833483
[2789]	Train's auc: 0.986004	Evaluation's auc: 0.833447
[2790]	Train's auc: 0.98601	Evaluation's auc: 0.833467
[2791]	Train's auc: 0.986012	Evaluation's auc: 0.833481
[2792]	Train's auc: 0.986012	Evaluation's auc: 0.83348
[2793]	Train's auc: 0.986016	Evaluation's auc: 0.833487
[2794]	Train's auc: 0.986019	Evaluation's auc: 0.833479
[2795]	Train's auc: 0.986022	Evaluation's auc: 0.833468
[2796]	Train's auc: 0.986024	Evaluation's auc: 0.83346
[2797]	Train's auc: 0.986028	Evaluation's auc: 0.833453
[2798]	Train's auc: 0.986036	Evaluation's auc: 0.833452
[2799]	Train's auc: 0.986049	Evaluation's auc: 0.833452
[2800]	Train's auc: 0.986056	Evaluation's auc: 0.833434
[2801]	Train's auc: 0.986055	Evaluation's auc: 0.833396
[2802]	Train's auc: 0.986058	Evaluation's auc: 0.833395
[2803]	Train's auc: 0.98606	Evaluation's auc: 0.833374
[2804]	Train's auc: 0.986063	Evaluation's auc: 0.833365
[2805]	Train's auc: 0.986068	Evaluation's auc: 0.833341
[2806]	Train's auc: 0.986073	Evaluation's auc: 0.833287
[2807]	Train's auc: 0.986078	Evaluation's auc: 0.833273
[2808]	Train's auc: 0.986082	Evaluation's auc: 0.83327
[2809]	Train's auc: 0.986087	Evaluation's auc: 0.833305
[2810]	Train's auc: 0.986087	Evaluation's auc: 0.833302
[2811]	Train's auc: 0.986087	Evaluation's auc: 0.833292
[2812]	Train's auc: 0.986087	Evaluation's auc: 0.833263
[2813]	Train's auc: 0.986097	Evaluation's auc: 0.833253
[2814]	Train's auc: 0.986102	Evaluation's auc: 0.833236
[2815]	Train's auc: 0.986106	Evaluation's auc: 0.833268
[2816]	Train's auc: 0.986112	Evaluation's auc: 0.833286
[2817]	Train's auc: 0.986115	Evaluation's auc: 0.833282
[2818]	Train's auc: 0.986116	Evaluation's auc: 0.833274
[2819]	Train's auc: 0.986123	Evaluation's auc: 0.833269
[2820]	Train's auc: 0.986127	Evaluation's auc: 0.833251
[2821]	Train's auc: 0.986132	Evaluation's auc: 0.833238
[2822]	Train's auc: 0.986134	Evaluation's auc: 0.833241
[2823]	Train's auc: 0.986135	Evaluation's auc: 0.833237
[2824]	Train's auc: 0.986138	Evaluation's auc: 0.83324
[2825]	Train's auc: 0.986145	Evaluation's auc: 0.83325
[2826]	Train's auc: 0.986158	Evaluation's auc: 0.833252
[2827]	Train's auc: 0.986164	Evaluation's auc: 0.833267
[2828]	Train's auc: 0.986172	Evaluation's auc: 0.833267
[2829]	Train's auc: 0.986172	Evaluation's auc: 0.833258
[2830]	Train's auc: 0.986174	Evaluation's auc: 0.833254
[2831]	Train's auc: 0.986178	Evaluation's auc: 0.833273
[2832]	Train's auc: 0.986183	Evaluation's auc: 0.833233
[2833]	Train's auc: 0.986189	Evaluation's auc: 0.83321
[2834]	Train's auc: 0.986201	Evaluation's auc: 0.833204
[2835]	Train's auc: 0.986198	Evaluation's auc: 0.833198
[2836]	Train's auc: 0.986204	Evaluation's auc: 0.833202
[2837]	Train's auc: 0.986204	Evaluation's auc: 0.833215
[2838]	Train's auc: 0.986207	Evaluation's auc: 0.833229
[2839]	Train's auc: 0.98621	Evaluation's auc: 0.833209
[2840]	Train's auc: 0.98621	Evaluation's auc: 0.833207
[2841]	Train's auc: 0.986216	Evaluation's auc: 0.833189
[2842]	Train's auc: 0.986217	Evaluation's auc: 0.833189
[2843]	Train's auc: 0.986231	Evaluation's auc: 0.833173
[2844]	Train's auc: 0.986238	Evaluation's auc: 0.833188
[2845]	Train's auc: 0.986238	Evaluation's auc: 0.833171
[2846]	Train's auc: 0.98624	Evaluation's auc: 0.833197
[2847]	Train's auc: 0.986243	Evaluation's auc: 0.833199
[2848]	Train's auc: 0.986244	Evaluation's auc: 0.833202
[2849]	Train's auc: 0.986256	Evaluation's auc: 0.833174
[2850]	Train's auc: 0.986262	Evaluation's auc: 0.833145
[2851]	Train's auc: 0.986276	Evaluation's auc: 0.833105
[2852]	Train's auc: 0.986281	Evaluation's auc: 0.833116
[2853]	Train's auc: 0.986286	Evaluation's auc: 0.833104
[2854]	Train's auc: 0.986289	Evaluation's auc: 0.833111
[2855]	Train's auc: 0.986295	Evaluation's auc: 0.833109
[2856]	Train's auc: 0.986301	Evaluation's auc: 0.833123
[2857]	Train's auc: 0.9863	Evaluation's auc: 0.833118
[2858]	Train's auc: 0.986312	Evaluation's auc: 0.833108
[2859]	Train's auc: 0.986315	Evaluation's auc: 0.833109
[2860]	Train's auc: 0.986325	Evaluation's auc: 0.833136
[2861]	Train's auc: 0.986332	Evaluation's auc: 0.833102
[2862]	Train's auc: 0.986337	Evaluation's auc: 0.833064
[2863]	Train's auc: 0.986339	Evaluation's auc: 0.833064
[2864]	Train's auc: 0.986338	Evaluation's auc: 0.833066
[2865]	Train's auc: 0.986339	Evaluation's auc: 0.833074
[2866]	Train's auc: 0.986331	Evaluation's auc: 0.833079
[2867]	Train's auc: 0.98633	Evaluation's auc: 0.833075
[2868]	Train's auc: 0.986338	Evaluation's auc: 0.833072
[2869]	Train's auc: 0.986343	Evaluation's auc: 0.833071
[2870]	Train's auc: 0.986344	Evaluation's auc: 0.833058
[2871]	Train's auc: 0.986347	Evaluation's auc: 0.833059
[2872]	Train's auc: 0.986348	Evaluation's auc: 0.833051
[2873]	Train's auc: 0.986362	Evaluation's auc: 0.833023
[2874]	Train's auc: 0.986364	Evaluation's auc: 0.833043
[2875]	Train's auc: 0.986366	Evaluation's auc: 0.833047
[2876]	Train's auc: 0.986369	Evaluation's auc: 0.833055
[2877]	Train's auc: 0.986361	Evaluation's auc: 0.833086
[2878]	Train's auc: 0.986364	Evaluation's auc: 0.833079
[2879]	Train's auc: 0.986372	Evaluation's auc: 0.833033
[2880]	Train's auc: 0.986374	Evaluation's auc: 0.833028
[2881]	Train's auc: 0.986388	Evaluation's auc: 0.833058
[2882]	Train's auc: 0.986392	Evaluation's auc: 0.833055
[2883]	Train's auc: 0.986399	Evaluation's auc: 0.833016
[2884]	Train's auc: 0.986405	Evaluation's auc: 0.83302
[2885]	Train's auc: 0.986418	Evaluation's auc: 0.83299
[2886]	Train's auc: 0.986422	Evaluation's auc: 0.832988
[2887]	Train's auc: 0.986422	Evaluation's auc: 0.832987
[2888]	Train's auc: 0.98643	Evaluation's auc: 0.832977
[2889]	Train's auc: 0.986433	Evaluation's auc: 0.832975
[2890]	Train's auc: 0.986439	Evaluation's auc: 0.832996
[2891]	Train's auc: 0.986443	Evaluation's auc: 0.832998
[2892]	Train's auc: 0.986447	Evaluation's auc: 0.832989
[2893]	Train's auc: 0.986455	Evaluation's auc: 0.832953
[2894]	Train's auc: 0.986459	Evaluation's auc: 0.832935
[2895]	Train's auc: 0.986461	Evaluation's auc: 0.832923
[2896]	Train's auc: 0.986466	Evaluation's auc: 0.832939
[2897]	Train's auc: 0.986467	Evaluation's auc: 0.832942
[2898]	Train's auc: 0.986466	Evaluation's auc: 0.832953
[2899]	Train's auc: 0.98647	Evaluation's auc: 0.832955
[2900]	Train's auc: 0.98647	Evaluation's auc: 0.832951
[2901]	Train's auc: 0.986477	Evaluation's auc: 0.832931
[2902]	Train's auc: 0.986478	Evaluation's auc: 0.83293
[2903]	Train's auc: 0.986481	Evaluation's auc: 0.832898
[2904]	Train's auc: 0.986493	Evaluation's auc: 0.832888
[2905]	Train's auc: 0.986497	Evaluation's auc: 0.832892
[2906]	Train's auc: 0.986499	Evaluation's auc: 0.832883
[2907]	Train's auc: 0.9865	Evaluation's auc: 0.832882
[2908]	Train's auc: 0.986503	Evaluation's auc: 0.832887
[2909]	Train's auc: 0.986507	Evaluation's auc: 0.832897
[2910]	Train's auc: 0.98651	Evaluation's auc: 0.832911
[2911]	Train's auc: 0.986513	Evaluation's auc: 0.832934
[2912]	Train's auc: 0.986513	Evaluation's auc: 0.832937
[2913]	Train's auc: 0.986513	Evaluation's auc: 0.832932
[2914]	Train's auc: 0.986514	Evaluation's auc: 0.832937
[2915]	Train's auc: 0.986531	Evaluation's auc: 0.83293
[2916]	Train's auc: 0.98654	Evaluation's auc: 0.832938
[2917]	Train's auc: 0.986552	Evaluation's auc: 0.832928
[2918]	Train's auc: 0.986554	Evaluation's auc: 0.832941
[2919]	Train's auc: 0.986554	Evaluation's auc: 0.832941
[2920]	Train's auc: 0.986555	Evaluation's auc: 0.832939
[2921]	Train's auc: 0.986561	Evaluation's auc: 0.832951
[2922]	Train's auc: 0.986564	Evaluation's auc: 0.832953
[2923]	Train's auc: 0.986565	Evaluation's auc: 0.832937
[2924]	Train's auc: 0.986569	Evaluation's auc: 0.832917
[2925]	Train's auc: 0.986572	Evaluation's auc: 0.832985
[2926]	Train's auc: 0.986581	Evaluation's auc: 0.832996
[2927]	Train's auc: 0.986583	Evaluation's auc: 0.83298
[2928]	Train's auc: 0.98659	Evaluation's auc: 0.832987
[2929]	Train's auc: 0.986593	Evaluation's auc: 0.832999
[2930]	Train's auc: 0.986598	Evaluation's auc: 0.833003
[2931]	Train's auc: 0.986599	Evaluation's auc: 0.83298
[2932]	Train's auc: 0.986601	Evaluation's auc: 0.833008
[2933]	Train's auc: 0.986607	Evaluation's auc: 0.833
[2934]	Train's auc: 0.986609	Evaluation's auc: 0.832992
[2935]	Train's auc: 0.986617	Evaluation's auc: 0.832979
[2936]	Train's auc: 0.986622	Evaluation's auc: 0.832984
[2937]	Train's auc: 0.986629	Evaluation's auc: 0.832967
[2938]	Train's auc: 0.98663	Evaluation's auc: 0.832972
[2939]	Train's auc: 0.986637	Evaluation's auc: 0.832953
[2940]	Train's auc: 0.986645	Evaluation's auc: 0.832935
[2941]	Train's auc: 0.986662	Evaluation's auc: 0.832937
[2942]	Train's auc: 0.986667	Evaluation's auc: 0.832948
...
[3583]	Train's auc: 0.988866	Evaluation's auc: 0.831564
[3584]	Train's auc: 0.988872	Evaluation's auc: 0.83156
[3585]	Train's auc: 0.988874	Evaluation's auc: 0.831587

(Per-iteration log truncated for readability. Over iterations 2940-3585, the training AUC rises steadily from about 0.9866 to 0.9889 while the evaluation AUC drifts down from a peak of about 0.8331 near iteration 2980 to about 0.8316 — the model is overfitting beyond that point.)
[3586]	Train's auc: 0.988893	Evaluation's auc: 0.831589
[3587]	Train's auc: 0.988896	Evaluation's auc: 0.831577
[3588]	Train's auc: 0.988901	Evaluation's auc: 0.831606
[3589]	Train's auc: 0.988905	Evaluation's auc: 0.831585
[3590]	Train's auc: 0.988908	Evaluation's auc: 0.831578
[3591]	Train's auc: 0.988907	Evaluation's auc: 0.831589
[3592]	Train's auc: 0.98891	Evaluation's auc: 0.831601
[3593]	Train's auc: 0.98891	Evaluation's auc: 0.831601
[3594]	Train's auc: 0.988914	Evaluation's auc: 0.831603
[3595]	Train's auc: 0.988914	Evaluation's auc: 0.831605
[3596]	Train's auc: 0.988914	Evaluation's auc: 0.831609
[3597]	Train's auc: 0.988913	Evaluation's auc: 0.831605
[3598]	Train's auc: 0.988917	Evaluation's auc: 0.831613
[3599]	Train's auc: 0.988918	Evaluation's auc: 0.831609
[3600]	Train's auc: 0.988922	Evaluation's auc: 0.831598
[3601]	Train's auc: 0.988924	Evaluation's auc: 0.831597
[3602]	Train's auc: 0.988928	Evaluation's auc: 0.831596
[3603]	Train's auc: 0.988929	Evaluation's auc: 0.831587
[3604]	Train's auc: 0.988931	Evaluation's auc: 0.831554
[3605]	Train's auc: 0.988937	Evaluation's auc: 0.831554
[3606]	Train's auc: 0.98894	Evaluation's auc: 0.83155
[3607]	Train's auc: 0.988946	Evaluation's auc: 0.831538
[3608]	Train's auc: 0.988948	Evaluation's auc: 0.831541
[3609]	Train's auc: 0.988949	Evaluation's auc: 0.831529
[3610]	Train's auc: 0.988954	Evaluation's auc: 0.831543
[3611]	Train's auc: 0.988954	Evaluation's auc: 0.831543
[3612]	Train's auc: 0.988954	Evaluation's auc: 0.831534
[3613]	Train's auc: 0.988955	Evaluation's auc: 0.83153
[3614]	Train's auc: 0.98896	Evaluation's auc: 0.831528
[3615]	Train's auc: 0.988962	Evaluation's auc: 0.831524
[3616]	Train's auc: 0.988961	Evaluation's auc: 0.831534
[3617]	Train's auc: 0.988965	Evaluation's auc: 0.83153
[3618]	Train's auc: 0.988966	Evaluation's auc: 0.831524
[3619]	Train's auc: 0.988968	Evaluation's auc: 0.831522
[3620]	Train's auc: 0.988969	Evaluation's auc: 0.831523
[3621]	Train's auc: 0.988972	Evaluation's auc: 0.831519
[3622]	Train's auc: 0.988984	Evaluation's auc: 0.83152
[3623]	Train's auc: 0.988983	Evaluation's auc: 0.831519
[3624]	Train's auc: 0.988987	Evaluation's auc: 0.831523
[3625]	Train's auc: 0.988985	Evaluation's auc: 0.831526
[3626]	Train's auc: 0.988988	Evaluation's auc: 0.831522
[3627]	Train's auc: 0.988991	Evaluation's auc: 0.831516
[3628]	Train's auc: 0.988996	Evaluation's auc: 0.83152
[3629]	Train's auc: 0.988997	Evaluation's auc: 0.83151
[3630]	Train's auc: 0.989002	Evaluation's auc: 0.831529
[3631]	Train's auc: 0.989003	Evaluation's auc: 0.831528
[3632]	Train's auc: 0.989007	Evaluation's auc: 0.831514
[3633]	Train's auc: 0.989008	Evaluation's auc: 0.831515
[3634]	Train's auc: 0.989005	Evaluation's auc: 0.831514
[3635]	Train's auc: 0.989006	Evaluation's auc: 0.831506
[3636]	Train's auc: 0.989007	Evaluation's auc: 0.831509
[3637]	Train's auc: 0.989008	Evaluation's auc: 0.831527
[3638]	Train's auc: 0.989014	Evaluation's auc: 0.831513
[3639]	Train's auc: 0.989015	Evaluation's auc: 0.831488
[3640]	Train's auc: 0.989017	Evaluation's auc: 0.831481
[3641]	Train's auc: 0.989022	Evaluation's auc: 0.831477
[3642]	Train's auc: 0.989022	Evaluation's auc: 0.831477
[3643]	Train's auc: 0.989025	Evaluation's auc: 0.831479
[3644]	Train's auc: 0.989035	Evaluation's auc: 0.831485
[3645]	Train's auc: 0.989035	Evaluation's auc: 0.831481
[3646]	Train's auc: 0.989034	Evaluation's auc: 0.831492
[3647]	Train's auc: 0.989036	Evaluation's auc: 0.831496
[3648]	Train's auc: 0.989039	Evaluation's auc: 0.831487
[3649]	Train's auc: 0.989041	Evaluation's auc: 0.83146
[3650]	Train's auc: 0.98904	Evaluation's auc: 0.831477
[3651]	Train's auc: 0.98904	Evaluation's auc: 0.83148
[3652]	Train's auc: 0.989043	Evaluation's auc: 0.831469
[3653]	Train's auc: 0.989046	Evaluation's auc: 0.831468
[3654]	Train's auc: 0.989046	Evaluation's auc: 0.83147
[3655]	Train's auc: 0.989047	Evaluation's auc: 0.831471
[3656]	Train's auc: 0.989048	Evaluation's auc: 0.831477
[3657]	Train's auc: 0.989049	Evaluation's auc: 0.831461
[3658]	Train's auc: 0.98905	Evaluation's auc: 0.831461
[3659]	Train's auc: 0.989049	Evaluation's auc: 0.83145
[3660]	Train's auc: 0.989052	Evaluation's auc: 0.831468
[3661]	Train's auc: 0.989057	Evaluation's auc: 0.831454
[3662]	Train's auc: 0.989057	Evaluation's auc: 0.831453
[3663]	Train's auc: 0.989059	Evaluation's auc: 0.83144
[3664]	Train's auc: 0.98906	Evaluation's auc: 0.831421
[3665]	Train's auc: 0.989063	Evaluation's auc: 0.831425
[3666]	Train's auc: 0.989075	Evaluation's auc: 0.831424
[3667]	Train's auc: 0.989075	Evaluation's auc: 0.831423
[3668]	Train's auc: 0.989078	Evaluation's auc: 0.83144
[3669]	Train's auc: 0.989084	Evaluation's auc: 0.831443
[3670]	Train's auc: 0.989091	Evaluation's auc: 0.831444
[3671]	Train's auc: 0.989094	Evaluation's auc: 0.831431
[3672]	Train's auc: 0.989097	Evaluation's auc: 0.831419
[3673]	Train's auc: 0.989097	Evaluation's auc: 0.831418
[3674]	Train's auc: 0.989099	Evaluation's auc: 0.831401
[3675]	Train's auc: 0.989105	Evaluation's auc: 0.831392
[3676]	Train's auc: 0.989106	Evaluation's auc: 0.831391
[3677]	Train's auc: 0.989109	Evaluation's auc: 0.831382
[3678]	Train's auc: 0.989111	Evaluation's auc: 0.831347
[3679]	Train's auc: 0.989112	Evaluation's auc: 0.831348
[3680]	Train's auc: 0.989113	Evaluation's auc: 0.831354
[3681]	Train's auc: 0.989114	Evaluation's auc: 0.831357
[3682]	Train's auc: 0.989116	Evaluation's auc: 0.831346
[3683]	Train's auc: 0.989116	Evaluation's auc: 0.831343
[3684]	Train's auc: 0.989121	Evaluation's auc: 0.831331
[3685]	Train's auc: 0.989124	Evaluation's auc: 0.831339
[3686]	Train's auc: 0.989128	Evaluation's auc: 0.831332
[3687]	Train's auc: 0.989135	Evaluation's auc: 0.831336
[3688]	Train's auc: 0.989143	Evaluation's auc: 0.831317
[3689]	Train's auc: 0.989144	Evaluation's auc: 0.831315
[3690]	Train's auc: 0.989147	Evaluation's auc: 0.831325
[3691]	Train's auc: 0.989148	Evaluation's auc: 0.83131
[3692]	Train's auc: 0.989148	Evaluation's auc: 0.831299
[3693]	Train's auc: 0.989148	Evaluation's auc: 0.831312
[3694]	Train's auc: 0.989148	Evaluation's auc: 0.831307
[3695]	Train's auc: 0.989154	Evaluation's auc: 0.831322
[3696]	Train's auc: 0.989158	Evaluation's auc: 0.831329
[3697]	Train's auc: 0.989154	Evaluation's auc: 0.831334
[3698]	Train's auc: 0.989156	Evaluation's auc: 0.831313
[3699]	Train's auc: 0.98916	Evaluation's auc: 0.831319
[3700]	Train's auc: 0.989162	Evaluation's auc: 0.831319
[3701]	Train's auc: 0.989163	Evaluation's auc: 0.831315
[3702]	Train's auc: 0.989163	Evaluation's auc: 0.831323
[3703]	Train's auc: 0.989164	Evaluation's auc: 0.831321
[3704]	Train's auc: 0.989166	Evaluation's auc: 0.831301
[3705]	Train's auc: 0.989169	Evaluation's auc: 0.831276
[3706]	Train's auc: 0.98917	Evaluation's auc: 0.831274
[3707]	Train's auc: 0.989171	Evaluation's auc: 0.831273
[3708]	Train's auc: 0.989173	Evaluation's auc: 0.831265
[3709]	Train's auc: 0.989173	Evaluation's auc: 0.831262
[3710]	Train's auc: 0.989174	Evaluation's auc: 0.831263
[3711]	Train's auc: 0.989174	Evaluation's auc: 0.831263
[3712]	Train's auc: 0.989173	Evaluation's auc: 0.831263
[3713]	Train's auc: 0.989175	Evaluation's auc: 0.831272
[3714]	Train's auc: 0.989182	Evaluation's auc: 0.831255
[3715]	Train's auc: 0.989182	Evaluation's auc: 0.831253
[3716]	Train's auc: 0.989182	Evaluation's auc: 0.831253
[3717]	Train's auc: 0.989183	Evaluation's auc: 0.831242
[3718]	Train's auc: 0.989185	Evaluation's auc: 0.831226
[3719]	Train's auc: 0.989187	Evaluation's auc: 0.831216
[3720]	Train's auc: 0.989188	Evaluation's auc: 0.831222
[3721]	Train's auc: 0.98919	Evaluation's auc: 0.831212
[3722]	Train's auc: 0.98919	Evaluation's auc: 0.831212
[3723]	Train's auc: 0.989191	Evaluation's auc: 0.831221
[3724]	Train's auc: 0.989194	Evaluation's auc: 0.831221
[3725]	Train's auc: 0.989197	Evaluation's auc: 0.83122
[3726]	Train's auc: 0.989199	Evaluation's auc: 0.831215
[3727]	Train's auc: 0.989207	Evaluation's auc: 0.831224
[3728]	Train's auc: 0.98921	Evaluation's auc: 0.831221
[3729]	Train's auc: 0.98921	Evaluation's auc: 0.831217
[3730]	Train's auc: 0.989217	Evaluation's auc: 0.831234
[3731]	Train's auc: 0.989217	Evaluation's auc: 0.83123
[3732]	Train's auc: 0.989217	Evaluation's auc: 0.831236
[3733]	Train's auc: 0.989217	Evaluation's auc: 0.831236
[3734]	Train's auc: 0.989217	Evaluation's auc: 0.831236
[3735]	Train's auc: 0.989217	Evaluation's auc: 0.831233
[3736]	Train's auc: 0.989217	Evaluation's auc: 0.831246
[3737]	Train's auc: 0.989219	Evaluation's auc: 0.831225
[3738]	Train's auc: 0.989221	Evaluation's auc: 0.831224
[3739]	Train's auc: 0.98922	Evaluation's auc: 0.831216
[3740]	Train's auc: 0.989225	Evaluation's auc: 0.831196
[3741]	Train's auc: 0.989221	Evaluation's auc: 0.831189
[3742]	Train's auc: 0.98923	Evaluation's auc: 0.831167
[3743]	Train's auc: 0.98923	Evaluation's auc: 0.831163
[3744]	Train's auc: 0.989232	Evaluation's auc: 0.831163
[3745]	Train's auc: 0.989232	Evaluation's auc: 0.831163
[3746]	Train's auc: 0.989235	Evaluation's auc: 0.831186
[3747]	Train's auc: 0.989237	Evaluation's auc: 0.831182
[3748]	Train's auc: 0.989238	Evaluation's auc: 0.831183
[3749]	Train's auc: 0.989244	Evaluation's auc: 0.831175
[3750]	Train's auc: 0.989247	Evaluation's auc: 0.831185
[3751]	Train's auc: 0.989247	Evaluation's auc: 0.831183
[3752]	Train's auc: 0.989251	Evaluation's auc: 0.831184
[3753]	Train's auc: 0.989249	Evaluation's auc: 0.831184
[3754]	Train's auc: 0.989252	Evaluation's auc: 0.831164
[3755]	Train's auc: 0.989252	Evaluation's auc: 0.83117
[3756]	Train's auc: 0.98925	Evaluation's auc: 0.831189
[3757]	Train's auc: 0.989254	Evaluation's auc: 0.831194
[3758]	Train's auc: 0.989255	Evaluation's auc: 0.831218
[3759]	Train's auc: 0.989254	Evaluation's auc: 0.831226
[3760]	Train's auc: 0.989268	Evaluation's auc: 0.831215
[3761]	Train's auc: 0.989268	Evaluation's auc: 0.831208
[3762]	Train's auc: 0.989268	Evaluation's auc: 0.831212
[3763]	Train's auc: 0.989268	Evaluation's auc: 0.831211
[3764]	Train's auc: 0.989271	Evaluation's auc: 0.831209
[3765]	Train's auc: 0.989271	Evaluation's auc: 0.831213
[3766]	Train's auc: 0.98927	Evaluation's auc: 0.831215
[3767]	Train's auc: 0.98927	Evaluation's auc: 0.831216
[3768]	Train's auc: 0.98927	Evaluation's auc: 0.831212
[3769]	Train's auc: 0.989271	Evaluation's auc: 0.831224
[3770]	Train's auc: 0.989269	Evaluation's auc: 0.831223
[3771]	Train's auc: 0.989268	Evaluation's auc: 0.831227
[3772]	Train's auc: 0.989271	Evaluation's auc: 0.831221
[3773]	Train's auc: 0.989277	Evaluation's auc: 0.831216
[3774]	Train's auc: 0.98928	Evaluation's auc: 0.831229
[3775]	Train's auc: 0.989282	Evaluation's auc: 0.831244
[3776]	Train's auc: 0.989282	Evaluation's auc: 0.831226
[3777]	Train's auc: 0.989283	Evaluation's auc: 0.831219
[3778]	Train's auc: 0.989288	Evaluation's auc: 0.831218
[3779]	Train's auc: 0.989293	Evaluation's auc: 0.831221
[3780]	Train's auc: 0.989293	Evaluation's auc: 0.831211
[3781]	Train's auc: 0.989293	Evaluation's auc: 0.831212
[3782]	Train's auc: 0.989295	Evaluation's auc: 0.831198
[3783]	Train's auc: 0.989296	Evaluation's auc: 0.831191
[3784]	Train's auc: 0.989298	Evaluation's auc: 0.831189
[3785]	Train's auc: 0.989295	Evaluation's auc: 0.831225
[3786]	Train's auc: 0.989295	Evaluation's auc: 0.831224
[3787]	Train's auc: 0.9893	Evaluation's auc: 0.831235
[3788]	Train's auc: 0.989302	Evaluation's auc: 0.831226
[3789]	Train's auc: 0.989303	Evaluation's auc: 0.83123
[3790]	Train's auc: 0.989303	Evaluation's auc: 0.83123
[3791]	Train's auc: 0.989305	Evaluation's auc: 0.831232
[3792]	Train's auc: 0.989308	Evaluation's auc: 0.831215
[3793]	Train's auc: 0.98931	Evaluation's auc: 0.831224
[3794]	Train's auc: 0.989311	Evaluation's auc: 0.831223
[3795]	Train's auc: 0.989311	Evaluation's auc: 0.831224
[3796]	Train's auc: 0.98931	Evaluation's auc: 0.831217
[3797]	Train's auc: 0.989315	Evaluation's auc: 0.831213
[3798]	Train's auc: 0.989315	Evaluation's auc: 0.831211
[3799]	Train's auc: 0.989316	Evaluation's auc: 0.831205
[3800]	Train's auc: 0.98932	Evaluation's auc: 0.831197
[3801]	Train's auc: 0.989323	Evaluation's auc: 0.831205
[3802]	Train's auc: 0.989325	Evaluation's auc: 0.831204
[3803]	Train's auc: 0.989326	Evaluation's auc: 0.831199
[3804]	Train's auc: 0.989328	Evaluation's auc: 0.831193
[3805]	Train's auc: 0.989331	Evaluation's auc: 0.831191
[3806]	Train's auc: 0.989336	Evaluation's auc: 0.831174
[3807]	Train's auc: 0.989338	Evaluation's auc: 0.831176
[3808]	Train's auc: 0.989337	Evaluation's auc: 0.83118
[3809]	Train's auc: 0.989338	Evaluation's auc: 0.831177
[3810]	Train's auc: 0.98934	Evaluation's auc: 0.831158
[3811]	Train's auc: 0.989344	Evaluation's auc: 0.831155
[3812]	Train's auc: 0.989346	Evaluation's auc: 0.831154
[3813]	Train's auc: 0.989352	Evaluation's auc: 0.831157
[3814]	Train's auc: 0.989357	Evaluation's auc: 0.831166
[3815]	Train's auc: 0.989362	Evaluation's auc: 0.831145
[3816]	Train's auc: 0.98937	Evaluation's auc: 0.831149
[3817]	Train's auc: 0.989372	Evaluation's auc: 0.831153
[3818]	Train's auc: 0.989376	Evaluation's auc: 0.83116
[3819]	Train's auc: 0.989378	Evaluation's auc: 0.831157
[3820]	Train's auc: 0.98938	Evaluation's auc: 0.831144
[3821]	Train's auc: 0.989391	Evaluation's auc: 0.831116
[3822]	Train's auc: 0.989391	Evaluation's auc: 0.831131
[3823]	Train's auc: 0.98939	Evaluation's auc: 0.831132
[3824]	Train's auc: 0.989391	Evaluation's auc: 0.831136
[3825]	Train's auc: 0.989391	Evaluation's auc: 0.831136
[3826]	Train's auc: 0.989391	Evaluation's auc: 0.831136
[3827]	Train's auc: 0.989395	Evaluation's auc: 0.831129
[3828]	Train's auc: 0.989395	Evaluation's auc: 0.831132
[3829]	Train's auc: 0.989399	Evaluation's auc: 0.831116
[3830]	Train's auc: 0.989396	Evaluation's auc: 0.831133
[3831]	Train's auc: 0.989401	Evaluation's auc: 0.831127
[3832]	Train's auc: 0.989403	Evaluation's auc: 0.831117
[3833]	Train's auc: 0.989403	Evaluation's auc: 0.831095
[3834]	Train's auc: 0.989404	Evaluation's auc: 0.8311
[3835]	Train's auc: 0.989403	Evaluation's auc: 0.831104
[3836]	Train's auc: 0.989404	Evaluation's auc: 0.831101
[3837]	Train's auc: 0.989407	Evaluation's auc: 0.831104
[3838]	Train's auc: 0.989413	Evaluation's auc: 0.831101
[3839]	Train's auc: 0.989413	Evaluation's auc: 0.831104
[3840]	Train's auc: 0.989417	Evaluation's auc: 0.831125
[3841]	Train's auc: 0.989427	Evaluation's auc: 0.831128
[3842]	Train's auc: 0.98943	Evaluation's auc: 0.831123
[3843]	Train's auc: 0.989437	Evaluation's auc: 0.83111
[3844]	Train's auc: 0.989443	Evaluation's auc: 0.831099
[3845]	Train's auc: 0.989439	Evaluation's auc: 0.831108
[3846]	Train's auc: 0.989439	Evaluation's auc: 0.831106
[3847]	Train's auc: 0.98944	Evaluation's auc: 0.83111
[3848]	Train's auc: 0.989437	Evaluation's auc: 0.831127
[3849]	Train's auc: 0.989437	Evaluation's auc: 0.831126
[3850]	Train's auc: 0.989438	Evaluation's auc: 0.831125
[3851]	Train's auc: 0.98944	Evaluation's auc: 0.831137
[3852]	Train's auc: 0.989441	Evaluation's auc: 0.831135
[3853]	Train's auc: 0.989442	Evaluation's auc: 0.831132
[3854]	Train's auc: 0.989445	Evaluation's auc: 0.831127
[3855]	Train's auc: 0.989443	Evaluation's auc: 0.831132
[3856]	Train's auc: 0.989445	Evaluation's auc: 0.831142
[3857]	Train's auc: 0.989445	Evaluation's auc: 0.831142
[3858]	Train's auc: 0.989447	Evaluation's auc: 0.831146
[3859]	Train's auc: 0.989451	Evaluation's auc: 0.831138
[3860]	Train's auc: 0.98945	Evaluation's auc: 0.831144
[3861]	Train's auc: 0.989453	Evaluation's auc: 0.831136
[3862]	Train's auc: 0.989455	Evaluation's auc: 0.831122
[3863]	Train's auc: 0.989457	Evaluation's auc: 0.831118
[3864]	Train's auc: 0.989457	Evaluation's auc: 0.831121
[3865]	Train's auc: 0.98946	Evaluation's auc: 0.831122
[3866]	Train's auc: 0.989461	Evaluation's auc: 0.831116
[3867]	Train's auc: 0.989466	Evaluation's auc: 0.831127
[3868]	Train's auc: 0.989472	Evaluation's auc: 0.831107
[3869]	Train's auc: 0.989471	Evaluation's auc: 0.8311
[3870]	Train's auc: 0.989471	Evaluation's auc: 0.831087
[3871]	Train's auc: 0.989476	Evaluation's auc: 0.831091
[3872]	Train's auc: 0.989481	Evaluation's auc: 0.831089
[3873]	Train's auc: 0.989488	Evaluation's auc: 0.83109
[3874]	Train's auc: 0.989488	Evaluation's auc: 0.831088
[3875]	Train's auc: 0.989489	Evaluation's auc: 0.831083
[3876]	Train's auc: 0.98949	Evaluation's auc: 0.831089
[3877]	Train's auc: 0.989491	Evaluation's auc: 0.831074
[3878]	Train's auc: 0.989487	Evaluation's auc: 0.831075
[3879]	Train's auc: 0.989492	Evaluation's auc: 0.831057
[3880]	Train's auc: 0.989493	Evaluation's auc: 0.831049
[3881]	Train's auc: 0.989498	Evaluation's auc: 0.831034
[3882]	Train's auc: 0.989501	Evaluation's auc: 0.83104
[3883]	Train's auc: 0.989504	Evaluation's auc: 0.83104
[3884]	Train's auc: 0.989506	Evaluation's auc: 0.831023
[3885]	Train's auc: 0.989506	Evaluation's auc: 0.831022
[3886]	Train's auc: 0.989506	Evaluation's auc: 0.831022
[3887]	Train's auc: 0.989507	Evaluation's auc: 0.831041
[3888]	Train's auc: 0.98952	Evaluation's auc: 0.831039
[3889]	Train's auc: 0.98952	Evaluation's auc: 0.831037
[3890]	Train's auc: 0.989521	Evaluation's auc: 0.831041
[3891]	Train's auc: 0.989526	Evaluation's auc: 0.83104
[3892]	Train's auc: 0.989526	Evaluation's auc: 0.83104
[3893]	Train's auc: 0.989523	Evaluation's auc: 0.831018
[3894]	Train's auc: 0.989525	Evaluation's auc: 0.830999
[3895]	Train's auc: 0.989528	Evaluation's auc: 0.830997
[3896]	Train's auc: 0.989526	Evaluation's auc: 0.831003
[3897]	Train's auc: 0.989522	Evaluation's auc: 0.831016
[3898]	Train's auc: 0.989525	Evaluation's auc: 0.83101
[3899]	Train's auc: 0.989531	Evaluation's auc: 0.831014
[3900]	Train's auc: 0.989534	Evaluation's auc: 0.831002
[3901]	Train's auc: 0.989536	Evaluation's auc: 0.830979
[3902]	Train's auc: 0.989538	Evaluation's auc: 0.830977
[3903]	Train's auc: 0.989541	Evaluation's auc: 0.831023
[3904]	Train's auc: 0.989543	Evaluation's auc: 0.83103
[3905]	Train's auc: 0.989543	Evaluation's auc: 0.83103
[3906]	Train's auc: 0.989543	Evaluation's auc: 0.83103
[3907]	Train's auc: 0.989543	Evaluation's auc: 0.831031
[3908]	Train's auc: 0.989545	Evaluation's auc: 0.83102
[3909]	Train's auc: 0.98955	Evaluation's auc: 0.831032
[3910]	Train's auc: 0.98955	Evaluation's auc: 0.831033
[3911]	Train's auc: 0.989555	Evaluation's auc: 0.831035
[3912]	Train's auc: 0.989558	Evaluation's auc: 0.831036
[3913]	Train's auc: 0.989558	Evaluation's auc: 0.831036
[3914]	Train's auc: 0.989559	Evaluation's auc: 0.831032
[3915]	Train's auc: 0.989557	Evaluation's auc: 0.831019
[3916]	Train's auc: 0.989557	Evaluation's auc: 0.831018
[3917]	Train's auc: 0.989557	Evaluation's auc: 0.831018
[3918]	Train's auc: 0.989557	Evaluation's auc: 0.831012
[3919]	Train's auc: 0.989557	Evaluation's auc: 0.831014
[3920]	Train's auc: 0.989559	Evaluation's auc: 0.83103
[3921]	Train's auc: 0.989561	Evaluation's auc: 0.831004
[3922]	Train's auc: 0.989562	Evaluation's auc: 0.830998
[3923]	Train's auc: 0.989563	Evaluation's auc: 0.830991
[3924]	Train's auc: 0.989567	Evaluation's auc: 0.830976
[3925]	Train's auc: 0.989569	Evaluation's auc: 0.830977
[3926]	Train's auc: 0.989571	Evaluation's auc: 0.830964
[3927]	Train's auc: 0.989577	Evaluation's auc: 0.830941
[3928]	Train's auc: 0.989577	Evaluation's auc: 0.830941
[3929]	Train's auc: 0.989574	Evaluation's auc: 0.830934
[3930]	Train's auc: 0.989573	Evaluation's auc: 0.830938
[3931]	Train's auc: 0.989576	Evaluation's auc: 0.830929
[3932]	Train's auc: 0.989577	Evaluation's auc: 0.830927
[3933]	Train's auc: 0.98958	Evaluation's auc: 0.83092
[3934]	Train's auc: 0.989582	Evaluation's auc: 0.830915
[3935]	Train's auc: 0.989584	Evaluation's auc: 0.830905
[3936]	Train's auc: 0.989585	Evaluation's auc: 0.830903
[3937]	Train's auc: 0.989585	Evaluation's auc: 0.830901
[3938]	Train's auc: 0.98959	Evaluation's auc: 0.830912
[3939]	Train's auc: 0.989589	Evaluation's auc: 0.830942
[3940]	Train's auc: 0.989593	Evaluation's auc: 0.830925
[3941]	Train's auc: 0.9896	Evaluation's auc: 0.830926
[3942]	Train's auc: 0.989609	Evaluation's auc: 0.830943
[3943]	Train's auc: 0.989607	Evaluation's auc: 0.830951
[3944]	Train's auc: 0.989611	Evaluation's auc: 0.830952
[3945]	Train's auc: 0.989613	Evaluation's auc: 0.830948
[3946]	Train's auc: 0.98962	Evaluation's auc: 0.830948
[3947]	Train's auc: 0.989622	Evaluation's auc: 0.83094
[3948]	Train's auc: 0.989628	Evaluation's auc: 0.83097
[3949]	Train's auc: 0.98963	Evaluation's auc: 0.830958
[3950]	Train's auc: 0.989633	Evaluation's auc: 0.830951
[3951]	Train's auc: 0.989633	Evaluation's auc: 0.830951
[3952]	Train's auc: 0.989636	Evaluation's auc: 0.830963
[3953]	Train's auc: 0.989636	Evaluation's auc: 0.830963
[3954]	Train's auc: 0.989635	Evaluation's auc: 0.830964
[3955]	Train's auc: 0.989636	Evaluation's auc: 0.830966
[3956]	Train's auc: 0.989639	Evaluation's auc: 0.830959
[3957]	Train's auc: 0.98964	Evaluation's auc: 0.83097
[3958]	Train's auc: 0.989642	Evaluation's auc: 0.830966
[3959]	Train's auc: 0.989643	Evaluation's auc: 0.830965
[3960]	Train's auc: 0.989644	Evaluation's auc: 0.830969
[3961]	Train's auc: 0.989644	Evaluation's auc: 0.83098
[3962]	Train's auc: 0.989641	Evaluation's auc: 0.83097
[3963]	Train's auc: 0.989644	Evaluation's auc: 0.830962
[3964]	Train's auc: 0.989646	Evaluation's auc: 0.830952
[3965]	Train's auc: 0.989653	Evaluation's auc: 0.830952
[3966]	Train's auc: 0.989653	Evaluation's auc: 0.830954
[3967]	Train's auc: 0.989654	Evaluation's auc: 0.830942
[3968]	Train's auc: 0.989654	Evaluation's auc: 0.830942
[3969]	Train's auc: 0.989654	Evaluation's auc: 0.83097
[3970]	Train's auc: 0.989655	Evaluation's auc: 0.83098
[3971]	Train's auc: 0.989658	Evaluation's auc: 0.83097
[3972]	Train's auc: 0.989659	Evaluation's auc: 0.830966
[3973]	Train's auc: 0.989663	Evaluation's auc: 0.830963
[3974]	Train's auc: 0.989663	Evaluation's auc: 0.830967
[3975]	Train's auc: 0.989667	Evaluation's auc: 0.830961
[3976]	Train's auc: 0.989674	Evaluation's auc: 0.830963
[3977]	Train's auc: 0.989671	Evaluation's auc: 0.830959
[3978]	Train's auc: 0.989673	Evaluation's auc: 0.830953
[3979]	Train's auc: 0.989668	Evaluation's auc: 0.830964
[3980]	Train's auc: 0.989669	Evaluation's auc: 0.830966
[3981]	Train's auc: 0.989672	Evaluation's auc: 0.830952
[3982]	Train's auc: 0.989674	Evaluation's auc: 0.830935
[3983]	Train's auc: 0.989675	Evaluation's auc: 0.830926
[3984]	Train's auc: 0.989678	Evaluation's auc: 0.830934
[3985]	Train's auc: 0.989678	Evaluation's auc: 0.830946
[3986]	Train's auc: 0.989678	Evaluation's auc: 0.830951
[3987]	Train's auc: 0.989679	Evaluation's auc: 0.830958
[3988]	Train's auc: 0.989685	Evaluation's auc: 0.830947
[3989]	Train's auc: 0.989688	Evaluation's auc: 0.830938
[3990]	Train's auc: 0.989689	Evaluation's auc: 0.830945
[3991]	Train's auc: 0.989693	Evaluation's auc: 0.830965
[3992]	Train's auc: 0.989698	Evaluation's auc: 0.830964
[3993]	Train's auc: 0.989698	Evaluation's auc: 0.830964
[3994]	Train's auc: 0.989699	Evaluation's auc: 0.830954
[3995]	Train's auc: 0.989699	Evaluation's auc: 0.830953
[3996]	Train's auc: 0.9897	Evaluation's auc: 0.830952
[3997]	Train's auc: 0.9897	Evaluation's auc: 0.830953
[3998]	Train's auc: 0.9897	Evaluation's auc: 0.830953
[3999]	Train's auc: 0.989704	Evaluation's auc: 0.830945
[4000]	Train's auc: 0.989709	Evaluation's auc: 0.830965
[4001]	Train's auc: 0.989709	Evaluation's auc: 0.830959
[4002]	Train's auc: 0.989709	Evaluation's auc: 0.830957
[4003]	Train's auc: 0.989711	Evaluation's auc: 0.830962
[4004]	Train's auc: 0.989711	Evaluation's auc: 0.830964
[4005]	Train's auc: 0.989711	Evaluation's auc: 0.830964
[4006]	Train's auc: 0.989713	Evaluation's auc: 0.830965
[4007]	Train's auc: 0.989716	Evaluation's auc: 0.830958
[4008]	Train's auc: 0.989719	Evaluation's auc: 0.830963
[4009]	Train's auc: 0.989728	Evaluation's auc: 0.830987
[4010]	Train's auc: 0.98973	Evaluation's auc: 0.830974
[4011]	Train's auc: 0.989733	Evaluation's auc: 0.830976
[4012]	Train's auc: 0.989734	Evaluation's auc: 0.830979
[4013]	Train's auc: 0.989734	Evaluation's auc: 0.830979
[4014]	Train's auc: 0.989735	Evaluation's auc: 0.830973
[4015]	Train's auc: 0.98974	Evaluation's auc: 0.830973
[4016]	Train's auc: 0.989741	Evaluation's auc: 0.830974
[4017]	Train's auc: 0.98974	Evaluation's auc: 0.830971
[4018]	Train's auc: 0.989742	Evaluation's auc: 0.830954
[4019]	Train's auc: 0.989744	Evaluation's auc: 0.830947
[4020]	Train's auc: 0.989744	Evaluation's auc: 0.830946
[4021]	Train's auc: 0.98974	Evaluation's auc: 0.830951
[4022]	Train's auc: 0.989741	Evaluation's auc: 0.830958
[4023]	Train's auc: 0.989742	Evaluation's auc: 0.830959
[4024]	Train's auc: 0.989742	Evaluation's auc: 0.830957
[4025]	Train's auc: 0.989748	Evaluation's auc: 0.83094
[4026]	Train's auc: 0.989748	Evaluation's auc: 0.83094
[4027]	Train's auc: 0.989752	Evaluation's auc: 0.830915
[4028]	Train's auc: 0.989757	Evaluation's auc: 0.830919
[4029]	Train's auc: 0.989757	Evaluation's auc: 0.830917
[4030]	Train's auc: 0.98976	Evaluation's auc: 0.830916
[4031]	Train's auc: 0.989764	Evaluation's auc: 0.830916
[4032]	Train's auc: 0.989761	Evaluation's auc: 0.830911
[4033]	Train's auc: 0.989763	Evaluation's auc: 0.83089
[4034]	Train's auc: 0.989764	Evaluation's auc: 0.830886
[4035]	Train's auc: 0.989767	Evaluation's auc: 0.830881
[4036]	Train's auc: 0.989768	Evaluation's auc: 0.830879
[4037]	Train's auc: 0.989768	Evaluation's auc: 0.830884
[4038]	Train's auc: 0.98977	Evaluation's auc: 0.830884
[4039]	Train's auc: 0.989773	Evaluation's auc: 0.830875
[4040]	Train's auc: 0.989772	Evaluation's auc: 0.830868
[4041]	Train's auc: 0.989773	Evaluation's auc: 0.830859
[4042]	Train's auc: 0.989775	Evaluation's auc: 0.830857
[4043]	Train's auc: 0.989777	Evaluation's auc: 0.830857
[4044]	Train's auc: 0.989777	Evaluation's auc: 0.830832
[4045]	Train's auc: 0.989778	Evaluation's auc: 0.830834
[4046]	Train's auc: 0.989781	Evaluation's auc: 0.83083
[4047]	Train's auc: 0.989781	Evaluation's auc: 0.830829
[4048]	Train's auc: 0.989779	Evaluation's auc: 0.830831
[4049]	Train's auc: 0.989777	Evaluation's auc: 0.830831
[4050]	Train's auc: 0.989779	Evaluation's auc: 0.830825
[4051]	Train's auc: 0.98978	Evaluation's auc: 0.830792
[4052]	Train's auc: 0.989778	Evaluation's auc: 0.830794
[4053]	Train's auc: 0.989778	Evaluation's auc: 0.830791
[4054]	Train's auc: 0.989778	Evaluation's auc: 0.830777
[4055]	Train's auc: 0.989778	Evaluation's auc: 0.830777
[4056]	Train's auc: 0.989778	Evaluation's auc: 0.830777
[4057]	Train's auc: 0.989778	Evaluation's auc: 0.830777
[4058]	Train's auc: 0.989779	Evaluation's auc: 0.83077
[4059]	Train's auc: 0.989791	Evaluation's auc: 0.830777
[4060]	Train's auc: 0.989793	Evaluation's auc: 0.830782
[4061]	Train's auc: 0.989795	Evaluation's auc: 0.830783
[4062]	Train's auc: 0.989803	Evaluation's auc: 0.830785
[4063]	Train's auc: 0.989801	Evaluation's auc: 0.830778
[4064]	Train's auc: 0.989799	Evaluation's auc: 0.830785
[4065]	Train's auc: 0.989801	Evaluation's auc: 0.830778
[4066]	Train's auc: 0.989805	Evaluation's auc: 0.830797
[4067]	Train's auc: 0.989808	Evaluation's auc: 0.830785
[4068]	Train's auc: 0.989795	Evaluation's auc: 0.830792
[4069]	Train's auc: 0.989795	Evaluation's auc: 0.830791
[4070]	Train's auc: 0.989798	Evaluation's auc: 0.830768
[4071]	Train's auc: 0.989802	Evaluation's auc: 0.830757
[4072]	Train's auc: 0.989803	Evaluation's auc: 0.830758
[4073]	Train's auc: 0.989811	Evaluation's auc: 0.830749
[4074]	Train's auc: 0.989808	Evaluation's auc: 0.830737
[4075]	Train's auc: 0.98981	Evaluation's auc: 0.830731
[4076]	Train's auc: 0.989817	Evaluation's auc: 0.830714
[4077]	Train's auc: 0.989817	Evaluation's auc: 0.830714
[4078]	Train's auc: 0.98982	Evaluation's auc: 0.830737
[4079]	Train's auc: 0.98982	Evaluation's auc: 0.830743
[4080]	Train's auc: 0.989824	Evaluation's auc: 0.830776
[4081]	Train's auc: 0.989839	Evaluation's auc: 0.830773
[4082]	Train's auc: 0.989835	Evaluation's auc: 0.830782
[4083]	Train's auc: 0.989836	Evaluation's auc: 0.830787
[4084]	Train's auc: 0.989837	Evaluation's auc: 0.830765
[4085]	Train's auc: 0.989841	Evaluation's auc: 0.830754
[4086]	Train's auc: 0.989838	Evaluation's auc: 0.830755
[4087]	Train's auc: 0.989841	Evaluation's auc: 0.830754
[4088]	Train's auc: 0.989841	Evaluation's auc: 0.830783
...
[4733]	Train's auc: 0.990649	Evaluation's auc: 0.830113
[Training log truncated: over iterations 4088 to 4733 the training AUC rises from 0.98984 to 0.99065 while the evaluation AUC drifts down from 0.83078 to 0.83011. The widening train/evaluation gap indicates the model is overfitting; no early stopping was triggered in this run.]
[4734]	Train's auc: 0.990649	Evaluation's auc: 0.830113
[4735]	Train's auc: 0.990648	Evaluation's auc: 0.830115
[4736]	Train's auc: 0.990653	Evaluation's auc: 0.830091
[4737]	Train's auc: 0.990653	Evaluation's auc: 0.830095
[4738]	Train's auc: 0.990658	Evaluation's auc: 0.830087
[4739]	Train's auc: 0.990658	Evaluation's auc: 0.830087
[4740]	Train's auc: 0.990658	Evaluation's auc: 0.830087
[4741]	Train's auc: 0.990658	Evaluation's auc: 0.830087
[4742]	Train's auc: 0.990658	Evaluation's auc: 0.830087
[4743]	Train's auc: 0.990658	Evaluation's auc: 0.830087
[4744]	Train's auc: 0.990658	Evaluation's auc: 0.830087
[4745]	Train's auc: 0.990658	Evaluation's auc: 0.830087
[4746]	Train's auc: 0.990658	Evaluation's auc: 0.830087
[4747]	Train's auc: 0.990659	Evaluation's auc: 0.830086
[4748]	Train's auc: 0.990659	Evaluation's auc: 0.830086
[4749]	Train's auc: 0.990659	Evaluation's auc: 0.830086
[4750]	Train's auc: 0.990659	Evaluation's auc: 0.830086
[4751]	Train's auc: 0.990659	Evaluation's auc: 0.830085
[4752]	Train's auc: 0.990659	Evaluation's auc: 0.830085
[4753]	Train's auc: 0.990659	Evaluation's auc: 0.830085
[4754]	Train's auc: 0.99066	Evaluation's auc: 0.830082
[4755]	Train's auc: 0.990664	Evaluation's auc: 0.830083
[4756]	Train's auc: 0.990664	Evaluation's auc: 0.830078
[4757]	Train's auc: 0.990664	Evaluation's auc: 0.830076
[4758]	Train's auc: 0.990664	Evaluation's auc: 0.830077
[4759]	Train's auc: 0.990664	Evaluation's auc: 0.830077
[4760]	Train's auc: 0.990664	Evaluation's auc: 0.830079
[4761]	Train's auc: 0.990664	Evaluation's auc: 0.830079
[4762]	Train's auc: 0.990664	Evaluation's auc: 0.83008
[4763]	Train's auc: 0.990664	Evaluation's auc: 0.83008
[4764]	Train's auc: 0.990664	Evaluation's auc: 0.830081
[4765]	Train's auc: 0.990664	Evaluation's auc: 0.830082
[4766]	Train's auc: 0.990664	Evaluation's auc: 0.830082
[4767]	Train's auc: 0.990664	Evaluation's auc: 0.830082
[4768]	Train's auc: 0.990664	Evaluation's auc: 0.830082
[4769]	Train's auc: 0.990664	Evaluation's auc: 0.830082
[4770]	Train's auc: 0.990664	Evaluation's auc: 0.830082
[4771]	Train's auc: 0.990663	Evaluation's auc: 0.830097
[4772]	Train's auc: 0.990663	Evaluation's auc: 0.830097
[4773]	Train's auc: 0.990663	Evaluation's auc: 0.830097
[4774]	Train's auc: 0.990661	Evaluation's auc: 0.830097
[4775]	Train's auc: 0.990663	Evaluation's auc: 0.830091
[4776]	Train's auc: 0.990663	Evaluation's auc: 0.8301
[4777]	Train's auc: 0.990663	Evaluation's auc: 0.830101
[4778]	Train's auc: 0.990663	Evaluation's auc: 0.830101
[4779]	Train's auc: 0.990664	Evaluation's auc: 0.830101
[4780]	Train's auc: 0.990663	Evaluation's auc: 0.830112
[4781]	Train's auc: 0.990665	Evaluation's auc: 0.830111
[4782]	Train's auc: 0.990666	Evaluation's auc: 0.830112
[4783]	Train's auc: 0.990667	Evaluation's auc: 0.830109
[4784]	Train's auc: 0.990669	Evaluation's auc: 0.830109
[4785]	Train's auc: 0.990669	Evaluation's auc: 0.830106
[4786]	Train's auc: 0.99067	Evaluation's auc: 0.830104
[4787]	Train's auc: 0.990676	Evaluation's auc: 0.830105
[4788]	Train's auc: 0.990677	Evaluation's auc: 0.830107
[4789]	Train's auc: 0.990677	Evaluation's auc: 0.830107
[4790]	Train's auc: 0.990677	Evaluation's auc: 0.830107
[4791]	Train's auc: 0.99068	Evaluation's auc: 0.830102
[4792]	Train's auc: 0.990679	Evaluation's auc: 0.830101
[4793]	Train's auc: 0.990683	Evaluation's auc: 0.830092
[4794]	Train's auc: 0.990683	Evaluation's auc: 0.830092
[4795]	Train's auc: 0.990682	Evaluation's auc: 0.830109
[4796]	Train's auc: 0.990684	Evaluation's auc: 0.830111
[4797]	Train's auc: 0.990684	Evaluation's auc: 0.830102
[4798]	Train's auc: 0.990683	Evaluation's auc: 0.830099
[4799]	Train's auc: 0.990684	Evaluation's auc: 0.830101
[4800]	Train's auc: 0.990684	Evaluation's auc: 0.830101
[4801]	Train's auc: 0.990684	Evaluation's auc: 0.830105
[4802]	Train's auc: 0.990683	Evaluation's auc: 0.830105
[4803]	Train's auc: 0.990684	Evaluation's auc: 0.830104
[4804]	Train's auc: 0.990684	Evaluation's auc: 0.830104
[4805]	Train's auc: 0.990684	Evaluation's auc: 0.830103
[4806]	Train's auc: 0.990685	Evaluation's auc: 0.830102
[4807]	Train's auc: 0.990684	Evaluation's auc: 0.830102
[4808]	Train's auc: 0.990686	Evaluation's auc: 0.830084
[4809]	Train's auc: 0.990687	Evaluation's auc: 0.830084
[4810]	Train's auc: 0.990687	Evaluation's auc: 0.830084
[4811]	Train's auc: 0.990687	Evaluation's auc: 0.830084
[4812]	Train's auc: 0.990687	Evaluation's auc: 0.830084
[4813]	Train's auc: 0.990686	Evaluation's auc: 0.830103
[4814]	Train's auc: 0.990687	Evaluation's auc: 0.830081
[4815]	Train's auc: 0.990687	Evaluation's auc: 0.830081
[4816]	Train's auc: 0.990687	Evaluation's auc: 0.830081
[4817]	Train's auc: 0.990687	Evaluation's auc: 0.830081
[4818]	Train's auc: 0.990687	Evaluation's auc: 0.830081
[4819]	Train's auc: 0.990687	Evaluation's auc: 0.830081
[4820]	Train's auc: 0.990687	Evaluation's auc: 0.830082
[4821]	Train's auc: 0.990687	Evaluation's auc: 0.830083
[4822]	Train's auc: 0.990687	Evaluation's auc: 0.830083
[4823]	Train's auc: 0.990687	Evaluation's auc: 0.830083
[4824]	Train's auc: 0.990687	Evaluation's auc: 0.830083
[4825]	Train's auc: 0.990687	Evaluation's auc: 0.830083
[4826]	Train's auc: 0.990687	Evaluation's auc: 0.830083
[4827]	Train's auc: 0.990687	Evaluation's auc: 0.830083
[4828]	Train's auc: 0.990687	Evaluation's auc: 0.830083
[4829]	Train's auc: 0.990687	Evaluation's auc: 0.830083
[4830]	Train's auc: 0.990687	Evaluation's auc: 0.830083
[4831]	Train's auc: 0.990687	Evaluation's auc: 0.83008
[4832]	Train's auc: 0.990687	Evaluation's auc: 0.83008
[4833]	Train's auc: 0.990687	Evaluation's auc: 0.83008
[4834]	Train's auc: 0.990687	Evaluation's auc: 0.83008
[4835]	Train's auc: 0.990687	Evaluation's auc: 0.83008
[4836]	Train's auc: 0.990687	Evaluation's auc: 0.83008
[4837]	Train's auc: 0.990687	Evaluation's auc: 0.830095
[4838]	Train's auc: 0.990688	Evaluation's auc: 0.830096
[4839]	Train's auc: 0.990688	Evaluation's auc: 0.830095
[4840]	Train's auc: 0.99069	Evaluation's auc: 0.830098
[4841]	Train's auc: 0.99069	Evaluation's auc: 0.830097
[4842]	Train's auc: 0.990689	Evaluation's auc: 0.830097
[4843]	Train's auc: 0.990689	Evaluation's auc: 0.830092
[4844]	Train's auc: 0.990689	Evaluation's auc: 0.830087
[4845]	Train's auc: 0.990689	Evaluation's auc: 0.830087
[4846]	Train's auc: 0.990689	Evaluation's auc: 0.830087
[4847]	Train's auc: 0.990689	Evaluation's auc: 0.830087
[4848]	Train's auc: 0.990688	Evaluation's auc: 0.830096
[4849]	Train's auc: 0.990689	Evaluation's auc: 0.830095
[4850]	Train's auc: 0.990688	Evaluation's auc: 0.830097
[4851]	Train's auc: 0.990689	Evaluation's auc: 0.830096
[4852]	Train's auc: 0.990689	Evaluation's auc: 0.830096
[4853]	Train's auc: 0.990689	Evaluation's auc: 0.830095
[4854]	Train's auc: 0.990688	Evaluation's auc: 0.83011
[4855]	Train's auc: 0.990688	Evaluation's auc: 0.83011
[4856]	Train's auc: 0.99069	Evaluation's auc: 0.830104
[4857]	Train's auc: 0.990689	Evaluation's auc: 0.830104
[4858]	Train's auc: 0.990691	Evaluation's auc: 0.830097
[4859]	Train's auc: 0.990691	Evaluation's auc: 0.830087
[4860]	Train's auc: 0.990691	Evaluation's auc: 0.830087
[4861]	Train's auc: 0.990691	Evaluation's auc: 0.830086
[4862]	Train's auc: 0.990691	Evaluation's auc: 0.830086
[4863]	Train's auc: 0.990691	Evaluation's auc: 0.830086
[4864]	Train's auc: 0.990691	Evaluation's auc: 0.830085
[4865]	Train's auc: 0.990691	Evaluation's auc: 0.830085
[4866]	Train's auc: 0.990691	Evaluation's auc: 0.830085
[4867]	Train's auc: 0.990691	Evaluation's auc: 0.830085
[4868]	Train's auc: 0.990691	Evaluation's auc: 0.830085
[4869]	Train's auc: 0.990691	Evaluation's auc: 0.830085
[4870]	Train's auc: 0.990691	Evaluation's auc: 0.830085
[4871]	Train's auc: 0.990692	Evaluation's auc: 0.830098
[4872]	Train's auc: 0.990692	Evaluation's auc: 0.830098
[4873]	Train's auc: 0.990692	Evaluation's auc: 0.830098
[4874]	Train's auc: 0.990692	Evaluation's auc: 0.830098
[4875]	Train's auc: 0.990692	Evaluation's auc: 0.830098
[4876]	Train's auc: 0.990692	Evaluation's auc: 0.830098
[4877]	Train's auc: 0.990692	Evaluation's auc: 0.830098
[4878]	Train's auc: 0.990692	Evaluation's auc: 0.830098
[4879]	Train's auc: 0.990692	Evaluation's auc: 0.830098
[4880]	Train's auc: 0.990692	Evaluation's auc: 0.830098
[4881]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4882]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4883]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4884]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4885]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4886]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4887]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4888]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4889]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4890]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4891]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4892]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4893]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4894]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4895]	Train's auc: 0.990692	Evaluation's auc: 0.830095
[4896]	Train's auc: 0.990693	Evaluation's auc: 0.830105
[4897]	Train's auc: 0.990692	Evaluation's auc: 0.830106
[4898]	Train's auc: 0.990693	Evaluation's auc: 0.830107
[4899]	Train's auc: 0.990693	Evaluation's auc: 0.830107
[4900]	Train's auc: 0.990693	Evaluation's auc: 0.830107
[4901]	Train's auc: 0.990693	Evaluation's auc: 0.830107
[4902]	Train's auc: 0.990693	Evaluation's auc: 0.830104
[4903]	Train's auc: 0.990693	Evaluation's auc: 0.830103
[4904]	Train's auc: 0.990693	Evaluation's auc: 0.830104
[4905]	Train's auc: 0.990693	Evaluation's auc: 0.830104
[4906]	Train's auc: 0.990693	Evaluation's auc: 0.830104
[4907]	Train's auc: 0.990693	Evaluation's auc: 0.830104
[4908]	Train's auc: 0.990693	Evaluation's auc: 0.830106
[4909]	Train's auc: 0.990693	Evaluation's auc: 0.830106
[4910]	Train's auc: 0.990693	Evaluation's auc: 0.830106
[4911]	Train's auc: 0.990693	Evaluation's auc: 0.830106
[4912]	Train's auc: 0.990693	Evaluation's auc: 0.830106
[4913]	Train's auc: 0.990693	Evaluation's auc: 0.830106
[4914]	Train's auc: 0.990693	Evaluation's auc: 0.830106
[4915]	Train's auc: 0.990693	Evaluation's auc: 0.830106
[4916]	Train's auc: 0.990693	Evaluation's auc: 0.830106
[4917]	Train's auc: 0.990693	Evaluation's auc: 0.830106
[4918]	Train's auc: 0.990694	Evaluation's auc: 0.830105
[4919]	Train's auc: 0.990694	Evaluation's auc: 0.830105
[4920]	Train's auc: 0.990692	Evaluation's auc: 0.830113
[4921]	Train's auc: 0.990694	Evaluation's auc: 0.830118
[4922]	Train's auc: 0.990695	Evaluation's auc: 0.830109
[4923]	Train's auc: 0.990696	Evaluation's auc: 0.830114
[4924]	Train's auc: 0.990696	Evaluation's auc: 0.830107
[4925]	Train's auc: 0.990696	Evaluation's auc: 0.830107
[4926]	Train's auc: 0.990696	Evaluation's auc: 0.830107
[4927]	Train's auc: 0.990696	Evaluation's auc: 0.830108
[4928]	Train's auc: 0.990696	Evaluation's auc: 0.830108
[4929]	Train's auc: 0.990696	Evaluation's auc: 0.830108
[4930]	Train's auc: 0.990696	Evaluation's auc: 0.830108
[4931]	Train's auc: 0.990695	Evaluation's auc: 0.830109
[4932]	Train's auc: 0.990695	Evaluation's auc: 0.830111
[4933]	Train's auc: 0.990695	Evaluation's auc: 0.830111
[4934]	Train's auc: 0.990695	Evaluation's auc: 0.830111
[4935]	Train's auc: 0.990695	Evaluation's auc: 0.830111
[4936]	Train's auc: 0.990695	Evaluation's auc: 0.830113
[4937]	Train's auc: 0.990695	Evaluation's auc: 0.830113
[4938]	Train's auc: 0.990695	Evaluation's auc: 0.830112
[4939]	Train's auc: 0.990695	Evaluation's auc: 0.830112
[4940]	Train's auc: 0.990695	Evaluation's auc: 0.830112
[4941]	Train's auc: 0.990695	Evaluation's auc: 0.830112
[4942]	Train's auc: 0.990693	Evaluation's auc: 0.830112
[4943]	Train's auc: 0.990694	Evaluation's auc: 0.830112
[4944]	Train's auc: 0.990697	Evaluation's auc: 0.830113
[4945]	Train's auc: 0.990698	Evaluation's auc: 0.830112
[4946]	Train's auc: 0.990703	Evaluation's auc: 0.830111
[4947]	Train's auc: 0.990702	Evaluation's auc: 0.830115
[4948]	Train's auc: 0.990705	Evaluation's auc: 0.830103
[4949]	Train's auc: 0.990705	Evaluation's auc: 0.830101
[4950]	Train's auc: 0.990706	Evaluation's auc: 0.830096
[4951]	Train's auc: 0.990708	Evaluation's auc: 0.830098
[4952]	Train's auc: 0.99071	Evaluation's auc: 0.830063
[4953]	Train's auc: 0.990711	Evaluation's auc: 0.830064
[4954]	Train's auc: 0.990711	Evaluation's auc: 0.830065
[4955]	Train's auc: 0.990712	Evaluation's auc: 0.830061
[4956]	Train's auc: 0.990712	Evaluation's auc: 0.830064
[4957]	Train's auc: 0.990714	Evaluation's auc: 0.830061
[4958]	Train's auc: 0.99072	Evaluation's auc: 0.830059
[4959]	Train's auc: 0.990722	Evaluation's auc: 0.830047
[4960]	Train's auc: 0.990722	Evaluation's auc: 0.830047
[4961]	Train's auc: 0.990723	Evaluation's auc: 0.830047
[4962]	Train's auc: 0.990723	Evaluation's auc: 0.830047
[4963]	Train's auc: 0.990723	Evaluation's auc: 0.830047
[4964]	Train's auc: 0.990723	Evaluation's auc: 0.830047
[4965]	Train's auc: 0.990722	Evaluation's auc: 0.830048
[4966]	Train's auc: 0.990729	Evaluation's auc: 0.830121
[4967]	Train's auc: 0.990729	Evaluation's auc: 0.83012
[4968]	Train's auc: 0.99073	Evaluation's auc: 0.830119
[4969]	Train's auc: 0.99073	Evaluation's auc: 0.830119
[4970]	Train's auc: 0.990729	Evaluation's auc: 0.830121
[4971]	Train's auc: 0.990732	Evaluation's auc: 0.830131
[4972]	Train's auc: 0.990733	Evaluation's auc: 0.830116
[4973]	Train's auc: 0.990733	Evaluation's auc: 0.830116
[4974]	Train's auc: 0.990733	Evaluation's auc: 0.830115
[4975]	Train's auc: 0.990733	Evaluation's auc: 0.830114
[4976]	Train's auc: 0.990733	Evaluation's auc: 0.830122
[4977]	Train's auc: 0.990735	Evaluation's auc: 0.830122
[4978]	Train's auc: 0.990738	Evaluation's auc: 0.830137
[4979]	Train's auc: 0.990744	Evaluation's auc: 0.830124
[4980]	Train's auc: 0.990742	Evaluation's auc: 0.830137
[4981]	Train's auc: 0.990743	Evaluation's auc: 0.830123
[4982]	Train's auc: 0.990748	Evaluation's auc: 0.830118
[4983]	Train's auc: 0.990753	Evaluation's auc: 0.830108
[4984]	Train's auc: 0.990756	Evaluation's auc: 0.83011
[4985]	Train's auc: 0.990755	Evaluation's auc: 0.830112
[4986]	Train's auc: 0.990755	Evaluation's auc: 0.830113
[4987]	Train's auc: 0.990755	Evaluation's auc: 0.830112
[4988]	Train's auc: 0.990756	Evaluation's auc: 0.830095
[4989]	Train's auc: 0.990757	Evaluation's auc: 0.830089
[4990]	Train's auc: 0.990756	Evaluation's auc: 0.83007
[4991]	Train's auc: 0.990756	Evaluation's auc: 0.830068
[4992]	Train's auc: 0.990756	Evaluation's auc: 0.830068
[4993]	Train's auc: 0.990756	Evaluation's auc: 0.830068
[4994]	Train's auc: 0.990755	Evaluation's auc: 0.830071
[4995]	Train's auc: 0.990755	Evaluation's auc: 0.83007
[4996]	Train's auc: 0.990755	Evaluation's auc: 0.83007
[4997]	Train's auc: 0.990755	Evaluation's auc: 0.830072
[4998]	Train's auc: 0.990755	Evaluation's auc: 0.830071
[4999]	Train's auc: 0.990756	Evaluation's auc: 0.830068
[5000]	Train's auc: 0.990758	Evaluation's auc: 0.830068

With Hyperopt as well, the tuned model is highly overfit:

Train AUC: 0.990758

Test AUC: 0.830068
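The roughly 0.16 gap between train and test AUC is the overfitting signal here. A minimal sketch of computing that gap with scikit-learn, using small hypothetical label/score arrays purely for illustration:

```python
from sklearn.metrics import roc_auc_score

# Hypothetical labels and predicted scores for train and test splits
y_train = [0, 0, 1, 1, 1, 0]
p_train = [0.1, 0.2, 0.9, 0.8, 0.95, 0.05]   # near-perfect separation
y_test  = [0, 1, 0, 1, 1, 0]
p_test  = [0.4, 0.6, 0.55, 0.7, 0.3, 0.2]    # noisier separation

train_auc = roc_auc_score(y_train, p_train)
test_auc = roc_auc_score(y_test, p_test)
gap = train_auc - test_auc  # a large gap indicates overfitting
```

In the lab's run the gap is 0.990758 - 0.830068 ≈ 0.161, which is why we keep looking for better-regularized settings below.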

Grid Search with LightGBM

In [90]:
from sklearn.model_selection import GridSearchCV
In [91]:
lgb_grid_params = {
    'boosting_type':['dart','gbdt'],
    'learning_rate': [0.005, 0.01, 0.0001],
    'n_estimators' : [3, 5],
    'num_leaves': [12,16,20],
    'random_state' : [501]
    # Updated from 'seed'
    #'colsample_bytree' : [???],
    #'subsample' : [???],
    #'reg_alpha' : [???],
    #'reg_lambda' : [???],
    }
In [92]:
# We can include early stopping with gradient boosted decision trees:

fit_params={"early_stopping_rounds":10,
            "eval_set" : [[x_val, y_val]]}
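`early_stopping_rounds=10` tells LightGBM to halt training once the validation metric has failed to improve for 10 consecutive boosting rounds, keeping the best iteration seen. The underlying logic can be sketched in plain Python (a simplified illustration, not LightGBM's implementation):

```python
def early_stop_index(scores, patience=10):
    """Return the index of the best score once `patience` consecutive
    rounds pass without improvement; otherwise the overall best index."""
    best, best_i = float("-inf"), 0
    for i, s in enumerate(scores):
        if s > best:
            best, best_i = s, i          # new best round
        elif i - best_i >= patience:
            break                        # patience exhausted: stop
    return best_i

# e.g. validation AUCs that plateau after round 2
early_stop_index([0.5, 0.6, 0.7, 0.7, 0.7], patience=2)
```

Note that, as the warning below shows, early stopping is skipped for the `dart` boosting type.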
In [93]:
lgb_mdl_gs = lgb.LGBMClassifier(boosting_type = 'gbdt',
                                n_jobs = -1,
                                objective = 'binary',
                                num_iterations = 100,
                                metric = 'auc'
                               )
In [94]:
lgb_grid_mdl = GridSearchCV(lgb_mdl_gs, lgb_grid_params,
                    verbose=0,
                    cv=4)
In [95]:
lgb_grid_mdl
Out[95]:
GridSearchCV(cv=4, error_score=nan,
             estimator=LGBMClassifier(boosting_type='gbdt', class_weight=None,
                                      colsample_bytree=1.0,
                                      importance_type='split',
                                      learning_rate=0.1, max_depth=-1,
                                      metric='auc', min_child_samples=20,
                                      min_child_weight=0.001,
                                      min_split_gain=0.0, n_estimators=100,
                                      n_jobs=-1, num_iterations=100,
                                      num_leaves=31, objective='binary',
                                      random_state=None, reg_alpha=0.0,
                                      reg_lambda=0.0, silent=True,
                                      subsample=1.0, subsample_for_bin=200000,
                                      subsample_freq=0),
             iid='deprecated', n_jobs=None,
             param_grid={'boosting_type': ['dart', 'gbdt'],
                         'learning_rate': [0.005, 0.01, 0.0001],
                         'n_estimators': [3, 5], 'num_leaves': [12, 16, 20],
                         'random_state': [501]},
             pre_dispatch='2*n_jobs', refit=True, return_train_score=False,
             scoring=None, verbose=0)
In [96]:
lgb_grid_mdl.get_params().keys()
Out[96]:
dict_keys(['cv', 'error_score', 'estimator__boosting_type', 'estimator__class_weight', 'estimator__colsample_bytree', 'estimator__importance_type', 'estimator__learning_rate', 'estimator__max_depth', 'estimator__min_child_samples', 'estimator__min_child_weight', 'estimator__min_split_gain', 'estimator__n_estimators', 'estimator__n_jobs', 'estimator__num_leaves', 'estimator__objective', 'estimator__random_state', 'estimator__reg_alpha', 'estimator__reg_lambda', 'estimator__silent', 'estimator__subsample', 'estimator__subsample_for_bin', 'estimator__subsample_freq', 'estimator__num_iterations', 'estimator__metric', 'estimator', 'iid', 'n_jobs', 'param_grid', 'pre_dispatch', 'refit', 'return_train_score', 'scoring', 'verbose'])
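The `estimator__` prefix in the keys above is how GridSearchCV exposes the wrapped model's parameters through the wrapper. A small illustration with a DecisionTreeClassifier (not the lab's LGBMClassifier, but the naming convention is the same):

```python
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

# GridSearchCV prefixes the wrapped estimator's parameters with
# "estimator__", so nested parameters are addressable from the wrapper.
gs = GridSearchCV(DecisionTreeClassifier(), {"max_depth": [2, 3]}, cv=2)
keys = gs.get_params().keys()          # contains "estimator__max_depth"
gs.set_params(estimator__max_depth=4)  # sets max_depth on the inner tree
```

This is why `param_grid` keys use the bare names (`'num_leaves'`, not `'estimator__num_leaves'`): the grid is applied directly to the estimator, while `get_params`/`set_params` on the wrapper need the prefix.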
In [97]:
lgb_grid_mdl.fit(x_train, y_train, **fit_params)
/Users/piumallick/anaconda3/lib/python3.7/site-packages/lightgbm/engine.py:148: UserWarning: Found `num_iterations` in params. Will use it instead of argument
  warnings.warn("Found `{}` in params. Will use it instead of argument".format(alias))
[1]	valid_0's auc: 0.7495
[2]	valid_0's auc: 0.7495
[3]	valid_0's auc: 0.7495
...
/Users/piumallick/anaconda3/lib/python3.7/site-packages/lightgbm/callback.py:192: UserWarning: Early stopping is not available in dart mode
  warnings.warn('Early stopping is not available in dart mode')
...
[100]	valid_0's auc: 0.762457
(output truncated: the per-iteration validation log repeats for every fold and parameter combination in the grid; fold validation AUCs settle in the 0.75-0.77 range)
[63]	valid_0's auc: 0.762099
[64]	valid_0's auc: 0.762097
[65]	valid_0's auc: 0.7623
[66]	valid_0's auc: 0.762314
[67]	valid_0's auc: 0.762316
[68]	valid_0's auc: 0.762281
[69]	valid_0's auc: 0.762411
[70]	valid_0's auc: 0.762074
[71]	valid_0's auc: 0.762073
[72]	valid_0's auc: 0.762089
[73]	valid_0's auc: 0.762087
[74]	valid_0's auc: 0.762067
[75]	valid_0's auc: 0.762067
[76]	valid_0's auc: 0.762115
[77]	valid_0's auc: 0.76199
[78]	valid_0's auc: 0.762443
[79]	valid_0's auc: 0.762179
[80]	valid_0's auc: 0.762287
[81]	valid_0's auc: 0.762479
[82]	valid_0's auc: 0.762476
[83]	valid_0's auc: 0.762475
[84]	valid_0's auc: 0.762384
[85]	valid_0's auc: 0.762212
[86]	valid_0's auc: 0.76221
[87]	valid_0's auc: 0.76221
[88]	valid_0's auc: 0.762148
[89]	valid_0's auc: 0.762149
[90]	valid_0's auc: 0.762396
[91]	valid_0's auc: 0.762631
[92]	valid_0's auc: 0.762333
[93]	valid_0's auc: 0.762636
[94]	valid_0's auc: 0.762636
[95]	valid_0's auc: 0.762675
[96]	valid_0's auc: 0.762796
[97]	valid_0's auc: 0.762798
[98]	valid_0's auc: 0.762913
[99]	valid_0's auc: 0.762842
[100]	valid_0's auc: 0.762814
[1]	valid_0's auc: 0.752298
[2]	valid_0's auc: 0.759276
[3]	valid_0's auc: 0.759067
[4]	valid_0's auc: 0.759079
[5]	valid_0's auc: 0.759079
[6]	valid_0's auc: 0.759532
[7]	valid_0's auc: 0.759532
[8]	valid_0's auc: 0.760022
[9]	valid_0's auc: 0.760088
[10]	valid_0's auc: 0.761088
[11]	valid_0's auc: 0.76118
[12]	valid_0's auc: 0.761204
[13]	valid_0's auc: 0.761241
[14]	valid_0's auc: 0.761636
[15]	valid_0's auc: 0.762392
[16]	valid_0's auc: 0.762519
[17]	valid_0's auc: 0.762632
[18]	valid_0's auc: 0.762461
[19]	valid_0's auc: 0.76206
[20]	valid_0's auc: 0.76206
[21]	valid_0's auc: 0.762252
[22]	valid_0's auc: 0.762043
[23]	valid_0's auc: 0.762071
[24]	valid_0's auc: 0.761918
[25]	valid_0's auc: 0.762181
[26]	valid_0's auc: 0.762143
[27]	valid_0's auc: 0.762652
[28]	valid_0's auc: 0.762273
[29]	valid_0's auc: 0.762258
[30]	valid_0's auc: 0.762682
[31]	valid_0's auc: 0.762936
[32]	valid_0's auc: 0.762906
[33]	valid_0's auc: 0.762494
[34]	valid_0's auc: 0.762525
[35]	valid_0's auc: 0.763031
[36]	valid_0's auc: 0.763035
[37]	valid_0's auc: 0.764186
[38]	valid_0's auc: 0.764316
[39]	valid_0's auc: 0.764316
[40]	valid_0's auc: 0.764385
[41]	valid_0's auc: 0.764265
[42]	valid_0's auc: 0.764223
[43]	valid_0's auc: 0.764279
[44]	valid_0's auc: 0.764414
[45]	valid_0's auc: 0.764437
[46]	valid_0's auc: 0.764763
[47]	valid_0's auc: 0.764756
[48]	valid_0's auc: 0.764393
[49]	valid_0's auc: 0.764862
[50]	valid_0's auc: 0.764694
[51]	valid_0's auc: 0.764903
[52]	valid_0's auc: 0.765013
[53]	valid_0's auc: 0.765105
[54]	valid_0's auc: 0.765277
[55]	valid_0's auc: 0.766446
[56]	valid_0's auc: 0.766831
[57]	valid_0's auc: 0.769933
[58]	valid_0's auc: 0.769906
[59]	valid_0's auc: 0.769609
[60]	valid_0's auc: 0.769994
[61]	valid_0's auc: 0.769815
[62]	valid_0's auc: 0.770042
[63]	valid_0's auc: 0.771214
[64]	valid_0's auc: 0.771252
[65]	valid_0's auc: 0.771431
[66]	valid_0's auc: 0.771427
[67]	valid_0's auc: 0.771581
[68]	valid_0's auc: 0.771358
[69]	valid_0's auc: 0.771738
[70]	valid_0's auc: 0.771727
[71]	valid_0's auc: 0.771701
[72]	valid_0's auc: 0.771878
[73]	valid_0's auc: 0.771759
[74]	valid_0's auc: 0.77178
[75]	valid_0's auc: 0.771755
[76]	valid_0's auc: 0.772198
[77]	valid_0's auc: 0.772287
[78]	valid_0's auc: 0.77261
[79]	valid_0's auc: 0.77281
[80]	valid_0's auc: 0.77312
[81]	valid_0's auc: 0.773289
[82]	valid_0's auc: 0.773413
[83]	valid_0's auc: 0.773471
[84]	valid_0's auc: 0.773488
[85]	valid_0's auc: 0.773609
[86]	valid_0's auc: 0.773539
[87]	valid_0's auc: 0.773566
[88]	valid_0's auc: 0.773577
[89]	valid_0's auc: 0.773545
[90]	valid_0's auc: 0.773541
[91]	valid_0's auc: 0.773535
[92]	valid_0's auc: 0.773512
[93]	valid_0's auc: 0.77355
[94]	valid_0's auc: 0.773548
[95]	valid_0's auc: 0.774004
[96]	valid_0's auc: 0.773721
[97]	valid_0's auc: 0.773677
[98]	valid_0's auc: 0.774199
[99]	valid_0's auc: 0.774101
[100]	valid_0's auc: 0.7746
[1]	valid_0's auc: 0.754636
[2]	valid_0's auc: 0.7683
[3]	valid_0's auc: 0.76755
[4]	valid_0's auc: 0.771673
[5]	valid_0's auc: 0.77165
[6]	valid_0's auc: 0.772091
[7]	valid_0's auc: 0.772091
[8]	valid_0's auc: 0.772132
[9]	valid_0's auc: 0.771605
[10]	valid_0's auc: 0.770775
[11]	valid_0's auc: 0.771277
[12]	valid_0's auc: 0.770707
[13]	valid_0's auc: 0.771283
[14]	valid_0's auc: 0.771711
[15]	valid_0's auc: 0.77119
[16]	valid_0's auc: 0.771633
[17]	valid_0's auc: 0.771584
[18]	valid_0's auc: 0.771588
[19]	valid_0's auc: 0.771543
[20]	valid_0's auc: 0.771543
[21]	valid_0's auc: 0.771886
[22]	valid_0's auc: 0.7715
[23]	valid_0's auc: 0.771517
[24]	valid_0's auc: 0.770857
[25]	valid_0's auc: 0.771035
[26]	valid_0's auc: 0.771912
[27]	valid_0's auc: 0.771809
[28]	valid_0's auc: 0.77258
[29]	valid_0's auc: 0.772955
[30]	valid_0's auc: 0.772971
[31]	valid_0's auc: 0.772641
[32]	valid_0's auc: 0.772611
[33]	valid_0's auc: 0.774166
[34]	valid_0's auc: 0.774213
[35]	valid_0's auc: 0.77425
[36]	valid_0's auc: 0.773347
[37]	valid_0's auc: 0.772993
[38]	valid_0's auc: 0.772906
[39]	valid_0's auc: 0.772932
[40]	valid_0's auc: 0.773008
[41]	valid_0's auc: 0.772858
[42]	valid_0's auc: 0.773067
[43]	valid_0's auc: 0.773347
[44]	valid_0's auc: 0.773352
[45]	valid_0's auc: 0.773312
[46]	valid_0's auc: 0.773936
[47]	valid_0's auc: 0.773849
[48]	valid_0's auc: 0.773299
[49]	valid_0's auc: 0.773392
[50]	valid_0's auc: 0.77455
[51]	valid_0's auc: 0.774309
[52]	valid_0's auc: 0.774456
[53]	valid_0's auc: 0.774478
[54]	valid_0's auc: 0.774523
[55]	valid_0's auc: 0.774329
[56]	valid_0's auc: 0.774536
[57]	valid_0's auc: 0.774473
[58]	valid_0's auc: 0.774467
[59]	valid_0's auc: 0.774537
[60]	valid_0's auc: 0.774542
[61]	valid_0's auc: 0.774563
[62]	valid_0's auc: 0.77455
[63]	valid_0's auc: 0.77465
[64]	valid_0's auc: 0.774631
[65]	valid_0's auc: 0.774766
[66]	valid_0's auc: 0.774779
[67]	valid_0's auc: 0.774751
[68]	valid_0's auc: 0.774863
[69]	valid_0's auc: 0.774616
[70]	valid_0's auc: 0.774581
[71]	valid_0's auc: 0.774516
[72]	valid_0's auc: 0.774723
[73]	valid_0's auc: 0.77471
[74]	valid_0's auc: 0.774703
[75]	valid_0's auc: 0.7747
[76]	valid_0's auc: 0.774569
[77]	valid_0's auc: 0.77472
[78]	valid_0's auc: 0.77478
[79]	valid_0's auc: 0.77467
[80]	valid_0's auc: 0.774905
[81]	valid_0's auc: 0.774915
[82]	valid_0's auc: 0.774959
[83]	valid_0's auc: 0.774998
[84]	valid_0's auc: 0.775001
[85]	valid_0's auc: 0.775307
[86]	valid_0's auc: 0.775288
[87]	valid_0's auc: 0.775306
[88]	valid_0's auc: 0.775257
[89]	valid_0's auc: 0.775222
[90]	valid_0's auc: 0.775235
[91]	valid_0's auc: 0.775213
[92]	valid_0's auc: 0.775292
[93]	valid_0's auc: 0.775469
[94]	valid_0's auc: 0.775466
[95]	valid_0's auc: 0.775585
[96]	valid_0's auc: 0.775567
[97]	valid_0's auc: 0.775574
[98]	valid_0's auc: 0.77562
[99]	valid_0's auc: 0.775569
[100]	valid_0's auc: 0.775689
[1]	valid_0's auc: 0.757698
[2]	valid_0's auc: 0.757698
[3]	valid_0's auc: 0.758663
[4]	valid_0's auc: 0.757968
[5]	valid_0's auc: 0.758662
[6]	valid_0's auc: 0.75798
[7]	valid_0's auc: 0.75798
[8]	valid_0's auc: 0.758104
[9]	valid_0's auc: 0.757959
[10]	valid_0's auc: 0.758427
[11]	valid_0's auc: 0.758427
[12]	valid_0's auc: 0.758427
[13]	valid_0's auc: 0.758433
[14]	valid_0's auc: 0.758325
[15]	valid_0's auc: 0.758354
[16]	valid_0's auc: 0.760319
[17]	valid_0's auc: 0.760263
[18]	valid_0's auc: 0.760263
[19]	valid_0's auc: 0.760263
[20]	valid_0's auc: 0.760263
[21]	valid_0's auc: 0.760255
[22]	valid_0's auc: 0.760736
[23]	valid_0's auc: 0.760654
[24]	valid_0's auc: 0.760664
[25]	valid_0's auc: 0.766487
[26]	valid_0's auc: 0.766489
[27]	valid_0's auc: 0.766417
[28]	valid_0's auc: 0.766308
[29]	valid_0's auc: 0.766493
[30]	valid_0's auc: 0.76637
[31]	valid_0's auc: 0.766556
[32]	valid_0's auc: 0.766554
[33]	valid_0's auc: 0.766802
[34]	valid_0's auc: 0.766753
[35]	valid_0's auc: 0.766767
[36]	valid_0's auc: 0.76748
[37]	valid_0's auc: 0.767074
[38]	valid_0's auc: 0.768172
[39]	valid_0's auc: 0.767759
[40]	valid_0's auc: 0.768357
[41]	valid_0's auc: 0.768318
[42]	valid_0's auc: 0.768544
[43]	valid_0's auc: 0.768559
[44]	valid_0's auc: 0.768601
[45]	valid_0's auc: 0.768854
[46]	valid_0's auc: 0.76941
[47]	valid_0's auc: 0.769373
[48]	valid_0's auc: 0.769155
[49]	valid_0's auc: 0.769244
[50]	valid_0's auc: 0.769526
[51]	valid_0's auc: 0.76957
[52]	valid_0's auc: 0.769874
[53]	valid_0's auc: 0.769973
[54]	valid_0's auc: 0.769784
[55]	valid_0's auc: 0.769842
[56]	valid_0's auc: 0.770137
[57]	valid_0's auc: 0.770307
[58]	valid_0's auc: 0.770308
[59]	valid_0's auc: 0.770285
[60]	valid_0's auc: 0.769929
[61]	valid_0's auc: 0.770009
[62]	valid_0's auc: 0.770198
[63]	valid_0's auc: 0.770596
[64]	valid_0's auc: 0.770676
[65]	valid_0's auc: 0.770492
[66]	valid_0's auc: 0.770493
[67]	valid_0's auc: 0.770469
[68]	valid_0's auc: 0.770274
[69]	valid_0's auc: 0.770311
[70]	valid_0's auc: 0.770581
[71]	valid_0's auc: 0.770507
[72]	valid_0's auc: 0.770393
[73]	valid_0's auc: 0.770362
[74]	valid_0's auc: 0.770346
[75]	valid_0's auc: 0.770352
[76]	valid_0's auc: 0.770722
[77]	valid_0's auc: 0.770691
[78]	valid_0's auc: 0.770881
[79]	valid_0's auc: 0.770734
[80]	valid_0's auc: 0.770691
[81]	valid_0's auc: 0.77073
[82]	valid_0's auc: 0.770836
[83]	valid_0's auc: 0.770829
[84]	valid_0's auc: 0.771244
[85]	valid_0's auc: 0.771144
[86]	valid_0's auc: 0.771156
[87]	valid_0's auc: 0.771087
[88]	valid_0's auc: 0.771085
[89]	valid_0's auc: 0.771283
[90]	valid_0's auc: 0.771214
[91]	valid_0's auc: 0.771206
[92]	valid_0's auc: 0.771251
[93]	valid_0's auc: 0.77113
[94]	valid_0's auc: 0.771272
[95]	valid_0's auc: 0.770991
[96]	valid_0's auc: 0.771477
[97]	valid_0's auc: 0.771409
[98]	valid_0's auc: 0.771384
[99]	valid_0's auc: 0.771218
[100]	valid_0's auc: 0.771661
[1]	valid_0's auc: 0.760154
[2]	valid_0's auc: 0.760161
[3]	valid_0's auc: 0.760248
[4]	valid_0's auc: 0.760916
[5]	valid_0's auc: 0.760916
[6]	valid_0's auc: 0.761004
[7]	valid_0's auc: 0.761004
[8]	valid_0's auc: 0.761004
[9]	valid_0's auc: 0.761194
[10]	valid_0's auc: 0.761286
[11]	valid_0's auc: 0.761286
[12]	valid_0's auc: 0.761286
[13]	valid_0's auc: 0.761198
[14]	valid_0's auc: 0.761275
[15]	valid_0's auc: 0.761194
[16]	valid_0's auc: 0.761278
[17]	valid_0's auc: 0.761268
[18]	valid_0's auc: 0.761268
[19]	valid_0's auc: 0.761268
[20]	valid_0's auc: 0.761268
[21]	valid_0's auc: 0.761268
[22]	valid_0's auc: 0.761268
[23]	valid_0's auc: 0.761265
[24]	valid_0's auc: 0.76137
[25]	valid_0's auc: 0.76179
[26]	valid_0's auc: 0.76182
[27]	valid_0's auc: 0.761936
[28]	valid_0's auc: 0.761923
[29]	valid_0's auc: 0.761952
[30]	valid_0's auc: 0.761943
[31]	valid_0's auc: 0.761915
[32]	valid_0's auc: 0.761915
[33]	valid_0's auc: 0.76199
[34]	valid_0's auc: 0.76199
[35]	valid_0's auc: 0.76199
[36]	valid_0's auc: 0.761954
[37]	valid_0's auc: 0.763266
[38]	valid_0's auc: 0.763244
[39]	valid_0's auc: 0.763254
[40]	valid_0's auc: 0.763929
[41]	valid_0's auc: 0.763875
[42]	valid_0's auc: 0.763886
[43]	valid_0's auc: 0.764064
[44]	valid_0's auc: 0.764485
[45]	valid_0's auc: 0.764143
[46]	valid_0's auc: 0.764466
[47]	valid_0's auc: 0.764466
[48]	valid_0's auc: 0.764681
[49]	valid_0's auc: 0.764643
[50]	valid_0's auc: 0.764705
[51]	valid_0's auc: 0.764722
[52]	valid_0's auc: 0.767087
[53]	valid_0's auc: 0.767179
[54]	valid_0's auc: 0.767144
[55]	valid_0's auc: 0.767281
[56]	valid_0's auc: 0.767512
[57]	valid_0's auc: 0.767443
[58]	valid_0's auc: 0.767491
[59]	valid_0's auc: 0.76749
[60]	valid_0's auc: 0.767484
[61]	valid_0's auc: 0.76747
[62]	valid_0's auc: 0.767511
[63]	valid_0's auc: 0.767614
[64]	valid_0's auc: 0.767631
[65]	valid_0's auc: 0.767783
[66]	valid_0's auc: 0.767704
[67]	valid_0's auc: 0.767651
[68]	valid_0's auc: 0.767657
[69]	valid_0's auc: 0.767636
[70]	valid_0's auc: 0.767677
[71]	valid_0's auc: 0.767687
[72]	valid_0's auc: 0.767681
[73]	valid_0's auc: 0.767664
[74]	valid_0's auc: 0.767656
[75]	valid_0's auc: 0.76765
[76]	valid_0's auc: 0.767736
[77]	valid_0's auc: 0.767909
[78]	valid_0's auc: 0.768071
[79]	valid_0's auc: 0.768152
[80]	valid_0's auc: 0.768076
[81]	valid_0's auc: 0.768155
[82]	valid_0's auc: 0.768269
[83]	valid_0's auc: 0.768283
[84]	valid_0's auc: 0.76829
[85]	valid_0's auc: 0.768314
[86]	valid_0's auc: 0.768297
[87]	valid_0's auc: 0.768305
[88]	valid_0's auc: 0.768307
[89]	valid_0's auc: 0.76831
[90]	valid_0's auc: 0.768319
[91]	valid_0's auc: 0.76833
[92]	valid_0's auc: 0.768383
[93]	valid_0's auc: 0.76835
[94]	valid_0's auc: 0.768348
[95]	valid_0's auc: 0.768241
[96]	valid_0's auc: 0.768246
[97]	valid_0's auc: 0.768247
[98]	valid_0's auc: 0.768846
[99]	valid_0's auc: 0.76882
[100]	valid_0's auc: 0.768802
[1]	valid_0's auc: 0.759786
[2]	valid_0's auc: 0.764657
[3]	valid_0's auc: 0.764758
[4]	valid_0's auc: 0.764375
[5]	valid_0's auc: 0.764375
[6]	valid_0's auc: 0.766985
[7]	valid_0's auc: 0.766985
[8]	valid_0's auc: 0.767017
[9]	valid_0's auc: 0.767087
[10]	valid_0's auc: 0.767124
[11]	valid_0's auc: 0.767094
[12]	valid_0's auc: 0.767124
[13]	valid_0's auc: 0.768049
[14]	valid_0's auc: 0.768137
[15]	valid_0's auc: 0.768011
[16]	valid_0's auc: 0.768724
[17]	valid_0's auc: 0.768589
[18]	valid_0's auc: 0.768595
[19]	valid_0's auc: 0.768595
[20]	valid_0's auc: 0.768595
[21]	valid_0's auc: 0.768619
[22]	valid_0's auc: 0.768595
[23]	valid_0's auc: 0.768595
[24]	valid_0's auc: 0.768723
[25]	valid_0's auc: 0.768513
[26]	valid_0's auc: 0.768403
[27]	valid_0's auc: 0.768035
[28]	valid_0's auc: 0.768247
[29]	valid_0's auc: 0.768126
[30]	valid_0's auc: 0.768128
[31]	valid_0's auc: 0.768196
[32]	valid_0's auc: 0.768173
[33]	valid_0's auc: 0.768436
[34]	valid_0's auc: 0.76823
[35]	valid_0's auc: 0.768254
[36]	valid_0's auc: 0.768215
[37]	valid_0's auc: 0.768752
[38]	valid_0's auc: 0.769168
[39]	valid_0's auc: 0.769103
[40]	valid_0's auc: 0.769495
[41]	valid_0's auc: 0.769406
[42]	valid_0's auc: 0.769375
[43]	valid_0's auc: 0.769349
[44]	valid_0's auc: 0.769347
[45]	valid_0's auc: 0.769434
[46]	valid_0's auc: 0.769499
[47]	valid_0's auc: 0.769453
[48]	valid_0's auc: 0.769494
[49]	valid_0's auc: 0.769587
[50]	valid_0's auc: 0.769778
[51]	valid_0's auc: 0.769701
[52]	valid_0's auc: 0.769953
[53]	valid_0's auc: 0.769821
[54]	valid_0's auc: 0.770096
[55]	valid_0's auc: 0.77001
[56]	valid_0's auc: 0.77088
[57]	valid_0's auc: 0.772296
[58]	valid_0's auc: 0.772263
[59]	valid_0's auc: 0.772162
[60]	valid_0's auc: 0.77219
[61]	valid_0's auc: 0.772107
[62]	valid_0's auc: 0.772749
[63]	valid_0's auc: 0.772469
[64]	valid_0's auc: 0.772355
[65]	valid_0's auc: 0.773286
[66]	valid_0's auc: 0.773342
[67]	valid_0's auc: 0.773373
[68]	valid_0's auc: 0.77344
[69]	valid_0's auc: 0.773408
[70]	valid_0's auc: 0.773717
[71]	valid_0's auc: 0.773612
[72]	valid_0's auc: 0.773527
[73]	valid_0's auc: 0.773558
[74]	valid_0's auc: 0.773404
[75]	valid_0's auc: 0.773545
[76]	valid_0's auc: 0.773719
[77]	valid_0's auc: 0.773939
[78]	valid_0's auc: 0.773657
[79]	valid_0's auc: 0.774132
[80]	valid_0's auc: 0.774251
[81]	valid_0's auc: 0.775493
[82]	valid_0's auc: 0.775668
[83]	valid_0's auc: 0.775637
[84]	valid_0's auc: 0.776247
[85]	valid_0's auc: 0.776914
[86]	valid_0's auc: 0.776719
[87]	valid_0's auc: 0.776801
[88]	valid_0's auc: 0.776778
[89]	valid_0's auc: 0.77678
[90]	valid_0's auc: 0.776568
[91]	valid_0's auc: 0.776637
[92]	valid_0's auc: 0.77673
[93]	valid_0's auc: 0.776782
[94]	valid_0's auc: 0.776763
[95]	valid_0's auc: 0.776884
[96]	valid_0's auc: 0.77733
[97]	valid_0's auc: 0.777335
[98]	valid_0's auc: 0.777751
[99]	valid_0's auc: 0.777748
[100]	valid_0's auc: 0.778022
[1]	valid_0's auc: 0.762215
[2]	valid_0's auc: 0.772303
[3]	valid_0's auc: 0.772421
[4]	valid_0's auc: 0.776396
[5]	valid_0's auc: 0.776396
[6]	valid_0's auc: 0.776034
[7]	valid_0's auc: 0.776034
[8]	valid_0's auc: 0.776073
[9]	valid_0's auc: 0.775492
[10]	valid_0's auc: 0.775158
[11]	valid_0's auc: 0.775186
[12]	valid_0's auc: 0.775138
[13]	valid_0's auc: 0.775428
[14]	valid_0's auc: 0.776102
[15]	valid_0's auc: 0.776007
[16]	valid_0's auc: 0.776659
[17]	valid_0's auc: 0.776328
[18]	valid_0's auc: 0.776428
[19]	valid_0's auc: 0.776702
[20]	valid_0's auc: 0.776702
[21]	valid_0's auc: 0.776649
[22]	valid_0's auc: 0.776457
[23]	valid_0's auc: 0.776534
[24]	valid_0's auc: 0.77625
[25]	valid_0's auc: 0.776649
[26]	valid_0's auc: 0.776991
[27]	valid_0's auc: 0.77737
[28]	valid_0's auc: 0.778308
[29]	valid_0's auc: 0.778112
[30]	valid_0's auc: 0.778101
[31]	valid_0's auc: 0.778954
[32]	valid_0's auc: 0.778571
[33]	valid_0's auc: 0.778402
[34]	valid_0's auc: 0.778444
[35]	valid_0's auc: 0.778519
[36]	valid_0's auc: 0.779053
[37]	valid_0's auc: 0.778855
[38]	valid_0's auc: 0.779286
[39]	valid_0's auc: 0.77918
[40]	valid_0's auc: 0.779893
[41]	valid_0's auc: 0.77988
[42]	valid_0's auc: 0.779792
[43]	valid_0's auc: 0.77996
[44]	valid_0's auc: 0.779959
[45]	valid_0's auc: 0.779746
[46]	valid_0's auc: 0.780106
[47]	valid_0's auc: 0.779928
[48]	valid_0's auc: 0.779702
[49]	valid_0's auc: 0.780486
[50]	valid_0's auc: 0.780543
[51]	valid_0's auc: 0.780377
[52]	valid_0's auc: 0.780245
[53]	valid_0's auc: 0.780296
[54]	valid_0's auc: 0.780477
[55]	valid_0's auc: 0.780359
[56]	valid_0's auc: 0.780587
[57]	valid_0's auc: 0.780592
[58]	valid_0's auc: 0.780519
[59]	valid_0's auc: 0.78048
[60]	valid_0's auc: 0.78066
[61]	valid_0's auc: 0.7806
[62]	valid_0's auc: 0.780529
[63]	valid_0's auc: 0.780512
[64]	valid_0's auc: 0.78051
[65]	valid_0's auc: 0.780494
[66]	valid_0's auc: 0.780505
[67]	valid_0's auc: 0.780639
[68]	valid_0's auc: 0.780643
[69]	valid_0's auc: 0.781405
[70]	valid_0's auc: 0.781379
[71]	valid_0's auc: 0.781365
[72]	valid_0's auc: 0.78147
[73]	valid_0's auc: 0.781439
[74]	valid_0's auc: 0.781477
[75]	valid_0's auc: 0.78146
[76]	valid_0's auc: 0.781614
[77]	valid_0's auc: 0.78155
[78]	valid_0's auc: 0.781541
[79]	valid_0's auc: 0.781483
[80]	valid_0's auc: 0.781614
[81]	valid_0's auc: 0.781545
[82]	valid_0's auc: 0.781525
[83]	valid_0's auc: 0.781526
[84]	valid_0's auc: 0.781457
[85]	valid_0's auc: 0.781331
[86]	valid_0's auc: 0.781399
[87]	valid_0's auc: 0.781426
[88]	valid_0's auc: 0.781474
[89]	valid_0's auc: 0.781477
[90]	valid_0's auc: 0.781462
[91]	valid_0's auc: 0.781498
[92]	valid_0's auc: 0.781466
[93]	valid_0's auc: 0.781557
[94]	valid_0's auc: 0.781488
[95]	valid_0's auc: 0.781433
[96]	valid_0's auc: 0.781656
[97]	valid_0's auc: 0.781655
[98]	valid_0's auc: 0.781619
[99]	valid_0's auc: 0.781606
[100]	valid_0's auc: 0.781655
[1]	valid_0's auc: 0.760477
[2]	valid_0's auc: 0.760477
[3]	valid_0's auc: 0.760477
[4]	valid_0's auc: 0.760477
[5]	valid_0's auc: 0.760477
[6]	valid_0's auc: 0.761103
[7]	valid_0's auc: 0.761103
[8]	valid_0's auc: 0.76106
[9]	valid_0's auc: 0.760922
[10]	valid_0's auc: 0.76187
[11]	valid_0's auc: 0.761742
[12]	valid_0's auc: 0.761742
[13]	valid_0's auc: 0.76187
[14]	valid_0's auc: 0.761554
[15]	valid_0's auc: 0.762118
[16]	valid_0's auc: 0.761546
[17]	valid_0's auc: 0.762863
[18]	valid_0's auc: 0.762864
[19]	valid_0's auc: 0.762864
[20]	valid_0's auc: 0.762864
[21]	valid_0's auc: 0.762727
[22]	valid_0's auc: 0.762862
[23]	valid_0's auc: 0.76289
[24]	valid_0's auc: 0.763555
[25]	valid_0's auc: 0.769488
[26]	valid_0's auc: 0.76942
[27]	valid_0's auc: 0.769341
[28]	valid_0's auc: 0.769035
[29]	valid_0's auc: 0.769221
[30]	valid_0's auc: 0.769021
[31]	valid_0's auc: 0.769883
[32]	valid_0's auc: 0.769575
[33]	valid_0's auc: 0.769276
[34]	valid_0's auc: 0.769172
[35]	valid_0's auc: 0.769169
[36]	valid_0's auc: 0.768997
[37]	valid_0's auc: 0.769885
[38]	valid_0's auc: 0.770603
[39]	valid_0's auc: 0.770569
[40]	valid_0's auc: 0.770706
[41]	valid_0's auc: 0.770673
[42]	valid_0's auc: 0.770714
[43]	valid_0's auc: 0.771279
[44]	valid_0's auc: 0.771266
[45]	valid_0's auc: 0.771327
[46]	valid_0's auc: 0.771266
[47]	valid_0's auc: 0.771327
[48]	valid_0's auc: 0.771392
[49]	valid_0's auc: 0.771751
[50]	valid_0's auc: 0.771923
[51]	valid_0's auc: 0.771944
[52]	valid_0's auc: 0.772184
[53]	valid_0's auc: 0.772863
[54]	valid_0's auc: 0.772887
[55]	valid_0's auc: 0.772932
[56]	valid_0's auc: 0.7734
[57]	valid_0's auc: 0.773666
[58]	valid_0's auc: 0.773605
[59]	valid_0's auc: 0.773455
[60]	valid_0's auc: 0.773439
[61]	valid_0's auc: 0.773473
[62]	valid_0's auc: 0.773906
[63]	valid_0's auc: 0.773854
[64]	valid_0's auc: 0.773928
[65]	valid_0's auc: 0.773975
[66]	valid_0's auc: 0.773847
[67]	valid_0's auc: 0.773871
[68]	valid_0's auc: 0.773834
[69]	valid_0's auc: 0.773791
[70]	valid_0's auc: 0.773813
[71]	valid_0's auc: 0.773753
[72]	valid_0's auc: 0.774121
[73]	valid_0's auc: 0.774117
[74]	valid_0's auc: 0.774116
[75]	valid_0's auc: 0.773973
[76]	valid_0's auc: 0.773985
[77]	valid_0's auc: 0.773966
[78]	valid_0's auc: 0.774442
[79]	valid_0's auc: 0.774453
[80]	valid_0's auc: 0.774368
[81]	valid_0's auc: 0.774675
[82]	valid_0's auc: 0.774685
[83]	valid_0's auc: 0.774644
[84]	valid_0's auc: 0.774938
[85]	valid_0's auc: 0.775045
[86]	valid_0's auc: 0.775054
[87]	valid_0's auc: 0.775045
[88]	valid_0's auc: 0.775089
[89]	valid_0's auc: 0.775053
[90]	valid_0's auc: 0.775075
[91]	valid_0's auc: 0.775078
[92]	valid_0's auc: 0.774998
[93]	valid_0's auc: 0.775079
[94]	valid_0's auc: 0.775069
[95]	valid_0's auc: 0.775078
[96]	valid_0's auc: 0.775271
[97]	valid_0's auc: 0.775257
[98]	valid_0's auc: 0.77529
[99]	valid_0's auc: 0.775282
[100]	valid_0's auc: 0.775197
[1]	valid_0's auc: 0.7495
[2]	valid_0's auc: 0.7495
[3]	valid_0's auc: 0.7495
[4]	valid_0's auc: 0.7495
[5]	valid_0's auc: 0.7495
[6]	valid_0's auc: 0.7495
[7]	valid_0's auc: 0.7495
[8]	valid_0's auc: 0.751338
[9]	valid_0's auc: 0.751338
[10]	valid_0's auc: 0.751078
[11]	valid_0's auc: 0.751078
[12]	valid_0's auc: 0.751338
[13]	valid_0's auc: 0.751338
[14]	valid_0's auc: 0.751338
[15]	valid_0's auc: 0.751078
[16]	valid_0's auc: 0.751078
[17]	valid_0's auc: 0.749299
[18]	valid_0's auc: 0.749299
[19]	valid_0's auc: 0.749299
[20]	valid_0's auc: 0.749299
[21]	valid_0's auc: 0.749299
[22]	valid_0's auc: 0.751078
[23]	valid_0's auc: 0.751078
[24]	valid_0's auc: 0.753588
[25]	valid_0's auc: 0.753588
[26]	valid_0's auc: 0.753588
[27]	valid_0's auc: 0.752608
[28]	valid_0's auc: 0.752634
[29]	valid_0's auc: 0.752634
[30]	valid_0's auc: 0.752634
[31]	valid_0's auc: 0.753588
[32]	valid_0's auc: 0.753588
[33]	valid_0's auc: 0.752608
[34]	valid_0's auc: 0.752608
[35]	valid_0's auc: 0.752634
[36]	valid_0's auc: 0.753663
[37]	valid_0's auc: 0.753588
[38]	valid_0's auc: 0.753817
[39]	valid_0's auc: 0.753817
[40]	valid_0's auc: 0.753801
[41]	valid_0's auc: 0.753622
[42]	valid_0's auc: 0.754569
[43]	valid_0's auc: 0.754617
[44]	valid_0's auc: 0.754617
[45]	valid_0's auc: 0.754608
[46]	valid_0's auc: 0.754558
[47]	valid_0's auc: 0.754558
[48]	valid_0's auc: 0.754558
[49]	valid_0's auc: 0.754628
[50]	valid_0's auc: 0.754558
[51]	valid_0's auc: 0.754579
[52]	valid_0's auc: 0.754617
[53]	valid_0's auc: 0.754621
[54]	valid_0's auc: 0.75465
[55]	valid_0's auc: 0.756121
[56]	valid_0's auc: 0.75612
[57]	valid_0's auc: 0.756265
[58]	valid_0's auc: 0.756265
[59]	valid_0's auc: 0.756265
[60]	valid_0's auc: 0.756283
[61]	valid_0's auc: 0.756283
[62]	valid_0's auc: 0.756245
[63]	valid_0's auc: 0.756262
[64]	valid_0's auc: 0.756262
[65]	valid_0's auc: 0.756285
[66]	valid_0's auc: 0.756247
[67]	valid_0's auc: 0.756247
[68]	valid_0's auc: 0.756285
[69]	valid_0's auc: 0.756247
[70]	valid_0's auc: 0.756247
[71]	valid_0's auc: 0.756247
[72]	valid_0's auc: 0.756247
[73]	valid_0's auc: 0.756247
[74]	valid_0's auc: 0.756247
[75]	valid_0's auc: 0.756247
[76]	valid_0's auc: 0.756246
[77]	valid_0's auc: 0.7563
[78]	valid_0's auc: 0.756268
[79]	valid_0's auc: 0.756322
[80]	valid_0's auc: 0.756322
[81]	valid_0's auc: 0.762092
[82]	valid_0's auc: 0.762097
[83]	valid_0's auc: 0.762082
[84]	valid_0's auc: 0.762117
[85]	valid_0's auc: 0.762138
[86]	valid_0's auc: 0.762142
[87]	valid_0's auc: 0.762142
[88]	valid_0's auc: 0.762142
[89]	valid_0's auc: 0.762133
[90]	valid_0's auc: 0.762127
[91]	valid_0's auc: 0.762118
[92]	valid_0's auc: 0.762166
[93]	valid_0's auc: 0.762127
[94]	valid_0's auc: 0.762127
[95]	valid_0's auc: 0.762173
[96]	valid_0's auc: 0.762131
[97]	valid_0's auc: 0.762127
[98]	valid_0's auc: 0.762461
[99]	valid_0's auc: 0.762457
[100]	valid_0's auc: 0.762457
[1]	valid_0's auc: 0.748453
[2]	valid_0's auc: 0.749691
[3]	valid_0's auc: 0.751615
[4]	valid_0's auc: 0.751094
[5]	valid_0's auc: 0.751119
[6]	valid_0's auc: 0.752006
[7]	valid_0's auc: 0.752006
[8]	valid_0's auc: 0.751982
[9]	valid_0's auc: 0.752532
[10]	valid_0's auc: 0.756575
[11]	valid_0's auc: 0.756658
[12]	valid_0's auc: 0.756658
[13]	valid_0's auc: 0.756638
[14]	valid_0's auc: 0.756577
[15]	valid_0's auc: 0.756643
[16]	valid_0's auc: 0.756819
[17]	valid_0's auc: 0.756629
[18]	valid_0's auc: 0.756785
[19]	valid_0's auc: 0.756519
[20]	valid_0's auc: 0.756519
[21]	valid_0's auc: 0.756542
[22]	valid_0's auc: 0.756415
[23]	valid_0's auc: 0.756415
[24]	valid_0's auc: 0.756995
[25]	valid_0's auc: 0.757296
[26]	valid_0's auc: 0.757296
[27]	valid_0's auc: 0.756938
[28]	valid_0's auc: 0.756638
[29]	valid_0's auc: 0.756657
[30]	valid_0's auc: 0.756656
[31]	valid_0's auc: 0.756939
[32]	valid_0's auc: 0.756922
[33]	valid_0's auc: 0.756658
[34]	valid_0's auc: 0.756658
[35]	valid_0's auc: 0.75676
[36]	valid_0's auc: 0.756836
[37]	valid_0's auc: 0.758193
[38]	valid_0's auc: 0.761837
[39]	valid_0's auc: 0.761771
[40]	valid_0's auc: 0.76182
[41]	valid_0's auc: 0.761735
[42]	valid_0's auc: 0.762922
[43]	valid_0's auc: 0.76256
[44]	valid_0's auc: 0.762561
[45]	valid_0's auc: 0.762974
[46]	valid_0's auc: 0.763946
[47]	valid_0's auc: 0.763934
[48]	valid_0's auc: 0.765772
[49]	valid_0's auc: 0.766153
[50]	valid_0's auc: 0.765867
[51]	valid_0's auc: 0.766182
[52]	valid_0's auc: 0.766029
[53]	valid_0's auc: 0.766586
[54]	valid_0's auc: 0.766755
[55]	valid_0's auc: 0.766879
[56]	valid_0's auc: 0.766713
[57]	valid_0's auc: 0.76723
[58]	valid_0's auc: 0.767224
[59]	valid_0's auc: 0.767274
[60]	valid_0's auc: 0.76708
[61]	valid_0's auc: 0.767084
[62]	valid_0's auc: 0.767316
[63]	valid_0's auc: 0.768125
[64]	valid_0's auc: 0.768149
[65]	valid_0's auc: 0.768159
[66]	valid_0's auc: 0.768155
[67]	valid_0's auc: 0.767995
[68]	valid_0's auc: 0.767966
[69]	valid_0's auc: 0.768898
[70]	valid_0's auc: 0.768752
[71]	valid_0's auc: 0.768493
[72]	valid_0's auc: 0.768935
[73]	valid_0's auc: 0.768953
[74]	valid_0's auc: 0.76896
[75]	valid_0's auc: 0.768928
[76]	valid_0's auc: 0.769088
[77]	valid_0's auc: 0.769044
[78]	valid_0's auc: 0.769472
[79]	valid_0's auc: 0.769704
[80]	valid_0's auc: 0.770133
[81]	valid_0's auc: 0.770279
[82]	valid_0's auc: 0.77036
[83]	valid_0's auc: 0.770189
[84]	valid_0's auc: 0.770399
[85]	valid_0's auc: 0.770458
[86]	valid_0's auc: 0.770468
[87]	valid_0's auc: 0.770465
[88]	valid_0's auc: 0.770445
[89]	valid_0's auc: 0.770423
[90]	valid_0's auc: 0.770407
[91]	valid_0's auc: 0.770374
[92]	valid_0's auc: 0.770488
[93]	valid_0's auc: 0.770414
[94]	valid_0's auc: 0.770352
[95]	valid_0's auc: 0.770421
[96]	valid_0's auc: 0.770474
[97]	valid_0's auc: 0.770517
[98]	valid_0's auc: 0.770554
[99]	valid_0's auc: 0.770537
[100]	valid_0's auc: 0.770578
[1]	valid_0's auc: 0.748867
[2]	valid_0's auc: 0.762521
[3]	valid_0's auc: 0.764062
[4]	valid_0's auc: 0.763258
[5]	valid_0's auc: 0.763258
[6]	valid_0's auc: 0.762373
[7]	valid_0's auc: 0.762373
[8]	valid_0's auc: 0.762343
[9]	valid_0's auc: 0.764046
[10]	valid_0's auc: 0.763356
[11]	valid_0's auc: 0.763356
[12]	valid_0's auc: 0.763956
[13]	valid_0's auc: 0.764331
[14]	valid_0's auc: 0.763847
[15]	valid_0's auc: 0.763082
[16]	valid_0's auc: 0.762994
[17]	valid_0's auc: 0.763904
[18]	valid_0's auc: 0.764023
[19]	valid_0's auc: 0.764023
[20]	valid_0's auc: 0.764023
[21]	valid_0's auc: 0.763919
[22]	valid_0's auc: 0.76395
[23]	valid_0's auc: 0.763922
[24]	valid_0's auc: 0.76392
[25]	valid_0's auc: 0.763752
[26]	valid_0's auc: 0.764454
[27]	valid_0's auc: 0.76508
[28]	valid_0's auc: 0.764877
[29]	valid_0's auc: 0.764979
[30]	valid_0's auc: 0.764998
[31]	valid_0's auc: 0.764434
[32]	valid_0's auc: 0.764837
[33]	valid_0's auc: 0.765323
[34]	valid_0's auc: 0.76526
[35]	valid_0's auc: 0.765255
[36]	valid_0's auc: 0.765941
[37]	valid_0's auc: 0.765249
[38]	valid_0's auc: 0.766137
[39]	valid_0's auc: 0.766162
[40]	valid_0's auc: 0.765359
[41]	valid_0's auc: 0.76535
[42]	valid_0's auc: 0.765736
[43]	valid_0's auc: 0.766502
[44]	valid_0's auc: 0.766498
[45]	valid_0's auc: 0.76613
[46]	valid_0's auc: 0.766363
[47]	valid_0's auc: 0.766347
[48]	valid_0's auc: 0.766244
[49]	valid_0's auc: 0.766738
[50]	valid_0's auc: 0.766597
[51]	valid_0's auc: 0.767732
[52]	valid_0's auc: 0.767454
[53]	valid_0's auc: 0.767521
[54]	valid_0's auc: 0.767431
[55]	valid_0's auc: 0.767401
[56]	valid_0's auc: 0.768159
[57]	valid_0's auc: 0.767965
[58]	valid_0's auc: 0.768016
[59]	valid_0's auc: 0.767889
[60]	valid_0's auc: 0.767804
[61]	valid_0's auc: 0.767866
[62]	valid_0's auc: 0.768152
[63]	valid_0's auc: 0.767838
[64]	valid_0's auc: 0.767946
[65]	valid_0's auc: 0.767746
[66]	valid_0's auc: 0.767741
[67]	valid_0's auc: 0.767787
[68]	valid_0's auc: 0.767848
[69]	valid_0's auc: 0.767646
[70]	valid_0's auc: 0.768215
[71]	valid_0's auc: 0.768195
[72]	valid_0's auc: 0.768186
[73]	valid_0's auc: 0.768187
[74]	valid_0's auc: 0.768195
[75]	valid_0's auc: 0.768135
[76]	valid_0's auc: 0.768456
[77]	valid_0's auc: 0.768422
[78]	valid_0's auc: 0.768505
[79]	valid_0's auc: 0.768409
[80]	valid_0's auc: 0.768388
[81]	valid_0's auc: 0.768728
[82]	valid_0's auc: 0.768667
[83]	valid_0's auc: 0.76866
[84]	valid_0's auc: 0.768468
[85]	valid_0's auc: 0.768719
[86]	valid_0's auc: 0.768696
[87]	valid_0's auc: 0.768676
[88]	valid_0's auc: 0.768684
[89]	valid_0's auc: 0.768693
[90]	valid_0's auc: 0.768666
[91]	valid_0's auc: 0.76873
[92]	valid_0's auc: 0.768665
[93]	valid_0's auc: 0.76916
[94]	valid_0's auc: 0.769141
[95]	valid_0's auc: 0.769077
[96]	valid_0's auc: 0.768722
[97]	valid_0's auc: 0.768731
[98]	valid_0's auc: 0.769051
[99]	valid_0's auc: 0.769084
[100]	valid_0's auc: 0.768879
[1-100]	valid_0's auc: 0.748286 -> 0.76742 (per-iteration eval log trimmed)
[1-100]	valid_0's auc: 0.757043 -> 0.762814 (per-iteration eval log trimmed)
[1-100]	valid_0's auc: 0.752298 -> 0.7746 (per-iteration eval log trimmed)
[1-100]	valid_0's auc: 0.754636 -> 0.775689 (per-iteration eval log trimmed)
[1-100]	valid_0's auc: 0.757698 -> 0.771661 (per-iteration eval log trimmed)
[1-100]	valid_0's auc: 0.760154 -> 0.768802 (per-iteration eval log trimmed)
[1-100]	valid_0's auc: 0.759786 -> 0.778022 (per-iteration eval log trimmed)
[1-100]	valid_0's auc: 0.762215 -> 0.781655 (per-iteration eval log trimmed)
[1-100]	valid_0's auc: 0.760477 -> 0.775197 (per-iteration eval log trimmed)
[1-100]	valid_0's auc: 0.7495 -> 0.769725 (per-iteration eval log trimmed)
[1-100]	valid_0's auc: 0.748453 -> 0.772598 (per-iteration eval log trimmed)
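Each 100-round block above is one LightGBM boosting run reporting `valid_0's auc` every round. Rather than scanning the log by eye, the best round of a run can be recovered programmatically; a minimal sketch (the regex assumes the exact `[n]	valid_0's auc: x` line format printed above):

```python
import re

# Matches log lines of the form "[n]\tvalid_0's auc: x"
AUC_LINE = re.compile(r"\[(\d+)\]\s+valid_0's auc:\s+([0-9.]+)")

def best_iteration(log_lines):
    """Return (round, auc) for the highest valid_0 AUC seen in one run's log."""
    best = None
    for line in log_lines:
        m = AUC_LINE.search(line)
        if m:
            rnd, auc = int(m.group(1)), float(m.group(2))
            if best is None or auc > best[1]:
                best = (rnd, auc)
    return best

# A few lines copied from the run above
log = [
    "[1]\tvalid_0's auc: 0.748453",
    "[93]\tvalid_0's auc: 0.772506",
    "[100]\tvalid_0's auc: 0.772598",
]
print(best_iteration(log))  # → (100, 0.772598)
```

Note that in several runs the best AUC occurs before round 100, which is exactly the situation LightGBM's early-stopping callback is designed to exploit.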
[1-84]	valid_0's auc: 0.748867 -> 0.77248 (per-iteration eval log trimmed)
[85]	valid_0's auc: 0.773027
[86]	valid_0's auc: 0.773028
[87]	valid_0's auc: 0.773038
[88]	valid_0's auc: 0.773018
[89]	valid_0's auc: 0.773081
[90]	valid_0's auc: 0.773037
[91]	valid_0's auc: 0.77304
[92]	valid_0's auc: 0.773228
[93]	valid_0's auc: 0.773083
[94]	valid_0's auc: 0.77306
[95]	valid_0's auc: 0.773082
[96]	valid_0's auc: 0.773302
[97]	valid_0's auc: 0.773207
[98]	valid_0's auc: 0.773282
[99]	valid_0's auc: 0.773215
[100]	valid_0's auc: 0.773306
[1]	valid_0's auc: 0.748286
[2]	valid_0's auc: 0.748643
[3]	valid_0's auc: 0.748974
[4]	valid_0's auc: 0.749498
[5]	valid_0's auc: 0.749498
[6]	valid_0's auc: 0.7492
[7]	valid_0's auc: 0.7492
[8]	valid_0's auc: 0.752539
[9]	valid_0's auc: 0.762203
[10]	valid_0's auc: 0.762197
[11]	valid_0's auc: 0.762188
[12]	valid_0's auc: 0.762281
[13]	valid_0's auc: 0.762658
[14]	valid_0's auc: 0.762467
[15]	valid_0's auc: 0.764294
[16]	valid_0's auc: 0.765254
[17]	valid_0's auc: 0.765911
[18]	valid_0's auc: 0.765918
[19]	valid_0's auc: 0.765884
[20]	valid_0's auc: 0.765884
[21]	valid_0's auc: 0.76579
[22]	valid_0's auc: 0.765663
[23]	valid_0's auc: 0.76566
[24]	valid_0's auc: 0.765605
[25]	valid_0's auc: 0.766181
[26]	valid_0's auc: 0.766276
[27]	valid_0's auc: 0.76655
[28]	valid_0's auc: 0.766088
[29]	valid_0's auc: 0.766655
[30]	valid_0's auc: 0.766075
[31]	valid_0's auc: 0.767689
[32]	valid_0's auc: 0.767668
[33]	valid_0's auc: 0.767533
[34]	valid_0's auc: 0.767118
[35]	valid_0's auc: 0.767054
[36]	valid_0's auc: 0.766499
[37]	valid_0's auc: 0.767457
[38]	valid_0's auc: 0.767132
[39]	valid_0's auc: 0.76716
[40]	valid_0's auc: 0.767097
[41]	valid_0's auc: 0.766949
[42]	valid_0's auc: 0.76775
[43]	valid_0's auc: 0.768144
[44]	valid_0's auc: 0.768126
[45]	valid_0's auc: 0.767837
[46]	valid_0's auc: 0.768498
[47]	valid_0's auc: 0.768361
[48]	valid_0's auc: 0.768177
[49]	valid_0's auc: 0.768134
[50]	valid_0's auc: 0.768357
[51]	valid_0's auc: 0.768274
[52]	valid_0's auc: 0.768282
[53]	valid_0's auc: 0.769519
[54]	valid_0's auc: 0.769432
[55]	valid_0's auc: 0.769116
[56]	valid_0's auc: 0.769951
[57]	valid_0's auc: 0.769814
[58]	valid_0's auc: 0.769683
[59]	valid_0's auc: 0.769728
[60]	valid_0's auc: 0.769657
[61]	valid_0's auc: 0.769657
[62]	valid_0's auc: 0.770357
[63]	valid_0's auc: 0.770351
[64]	valid_0's auc: 0.770457
[65]	valid_0's auc: 0.770513
[66]	valid_0's auc: 0.770786
[67]	valid_0's auc: 0.770685
[68]	valid_0's auc: 0.770703
[69]	valid_0's auc: 0.770468
[70]	valid_0's auc: 0.770401
[71]	valid_0's auc: 0.770392
[72]	valid_0's auc: 0.770924
[73]	valid_0's auc: 0.770897
[74]	valid_0's auc: 0.770976
[75]	valid_0's auc: 0.770904
[76]	valid_0's auc: 0.770828
[77]	valid_0's auc: 0.770694
[78]	valid_0's auc: 0.771128
[79]	valid_0's auc: 0.770851
[80]	valid_0's auc: 0.770718
[81]	valid_0's auc: 0.771544
[82]	valid_0's auc: 0.771195
[83]	valid_0's auc: 0.771197
[84]	valid_0's auc: 0.771199
[85]	valid_0's auc: 0.77162
[86]	valid_0's auc: 0.771434
[87]	valid_0's auc: 0.771354
[88]	valid_0's auc: 0.771411
[89]	valid_0's auc: 0.771386
[90]	valid_0's auc: 0.771444
[91]	valid_0's auc: 0.771538
[92]	valid_0's auc: 0.771342
[93]	valid_0's auc: 0.771303
[94]	valid_0's auc: 0.771238
[95]	valid_0's auc: 0.771726
[96]	valid_0's auc: 0.771687
[97]	valid_0's auc: 0.771616
[98]	valid_0's auc: 0.771642
[99]	valid_0's auc: 0.771567
[100]	valid_0's auc: 0.771987
[1]	valid_0's auc: 0.757043
[2]	valid_0's auc: 0.757043
[3]	valid_0's auc: 0.757043
[4]	valid_0's auc: 0.757043
[5]	valid_0's auc: 0.757043
[6]	valid_0's auc: 0.757043
[7]	valid_0's auc: 0.757043
[8]	valid_0's auc: 0.757043
[9]	valid_0's auc: 0.758795
[10]	valid_0's auc: 0.760475
[11]	valid_0's auc: 0.760597
[12]	valid_0's auc: 0.760584
[13]	valid_0's auc: 0.760681
[14]	valid_0's auc: 0.760683
[15]	valid_0's auc: 0.76056
[16]	valid_0's auc: 0.761402
[17]	valid_0's auc: 0.761529
[18]	valid_0's auc: 0.761529
[19]	valid_0's auc: 0.761591
[20]	valid_0's auc: 0.761595
[21]	valid_0's auc: 0.761558
[22]	valid_0's auc: 0.761546
[23]	valid_0's auc: 0.761522
[24]	valid_0's auc: 0.761983
[25]	valid_0's auc: 0.762505
[26]	valid_0's auc: 0.762508
[27]	valid_0's auc: 0.761965
[28]	valid_0's auc: 0.762515
[29]	valid_0's auc: 0.762438
[30]	valid_0's auc: 0.761944
[31]	valid_0's auc: 0.762352
[32]	valid_0's auc: 0.762362
[33]	valid_0's auc: 0.76204
[34]	valid_0's auc: 0.762014
[35]	valid_0's auc: 0.762034
[36]	valid_0's auc: 0.762299
[37]	valid_0's auc: 0.76202
[38]	valid_0's auc: 0.762389
[39]	valid_0's auc: 0.762389
[40]	valid_0's auc: 0.76257
[41]	valid_0's auc: 0.762487
[42]	valid_0's auc: 0.768379
[43]	valid_0's auc: 0.768593
[44]	valid_0's auc: 0.768641
[45]	valid_0's auc: 0.76864
[46]	valid_0's auc: 0.768701
[47]	valid_0's auc: 0.768685
[48]	valid_0's auc: 0.768725
[49]	valid_0's auc: 0.76878
[50]	valid_0's auc: 0.768815
[51]	valid_0's auc: 0.768899
[52]	valid_0's auc: 0.769356
[53]	valid_0's auc: 0.76957
[54]	valid_0's auc: 0.769592
[55]	valid_0's auc: 0.770163
[56]	valid_0's auc: 0.770087
[57]	valid_0's auc: 0.769987
[58]	valid_0's auc: 0.770034
[59]	valid_0's auc: 0.770034
[60]	valid_0's auc: 0.770112
[61]	valid_0's auc: 0.770101
[62]	valid_0's auc: 0.77021
[63]	valid_0's auc: 0.770568
[64]	valid_0's auc: 0.770577
[65]	valid_0's auc: 0.770507
[66]	valid_0's auc: 0.770555
[67]	valid_0's auc: 0.770527
[68]	valid_0's auc: 0.770536
[69]	valid_0's auc: 0.770528
[70]	valid_0's auc: 0.770708
[71]	valid_0's auc: 0.770806
[72]	valid_0's auc: 0.770551
[73]	valid_0's auc: 0.770486
[74]	valid_0's auc: 0.770564
[75]	valid_0's auc: 0.770553
[76]	valid_0's auc: 0.770568
[77]	valid_0's auc: 0.771518
[78]	valid_0's auc: 0.772063
[79]	valid_0's auc: 0.772278
[80]	valid_0's auc: 0.772168
[81]	valid_0's auc: 0.772335
[82]	valid_0's auc: 0.772699
[83]	valid_0's auc: 0.772683
[84]	valid_0's auc: 0.772628
[85]	valid_0's auc: 0.772676
[86]	valid_0's auc: 0.772657
[87]	valid_0's auc: 0.772683
[88]	valid_0's auc: 0.772761
[89]	valid_0's auc: 0.772677
[90]	valid_0's auc: 0.772599
[91]	valid_0's auc: 0.772597
[92]	valid_0's auc: 0.772742
[93]	valid_0's auc: 0.772798
[94]	valid_0's auc: 0.772787
[95]	valid_0's auc: 0.772724
[96]	valid_0's auc: 0.773257
[97]	valid_0's auc: 0.773116
[98]	valid_0's auc: 0.773239
[99]	valid_0's auc: 0.773173
[100]	valid_0's auc: 0.772973
[1]	valid_0's auc: 0.752298
[2]	valid_0's auc: 0.759276
[3]	valid_0's auc: 0.759067
[4]	valid_0's auc: 0.760567
[5]	valid_0's auc: 0.760567
[6]	valid_0's auc: 0.759938
[7]	valid_0's auc: 0.759938
[8]	valid_0's auc: 0.760226
[9]	valid_0's auc: 0.762248
[10]	valid_0's auc: 0.76367
[11]	valid_0's auc: 0.763621
[12]	valid_0's auc: 0.763836
[13]	valid_0's auc: 0.764138
[14]	valid_0's auc: 0.763808
[15]	valid_0's auc: 0.763779
[16]	valid_0's auc: 0.764396
[17]	valid_0's auc: 0.76465
[18]	valid_0's auc: 0.764675
[19]	valid_0's auc: 0.764672
[20]	valid_0's auc: 0.764672
[21]	valid_0's auc: 0.764679
[22]	valid_0's auc: 0.764654
[23]	valid_0's auc: 0.764654
[24]	valid_0's auc: 0.764862
[25]	valid_0's auc: 0.767665
[26]	valid_0's auc: 0.767657
[27]	valid_0's auc: 0.76752
[28]	valid_0's auc: 0.768079
[29]	valid_0's auc: 0.767723
[30]	valid_0's auc: 0.767753
[31]	valid_0's auc: 0.767904
[32]	valid_0's auc: 0.767735
[33]	valid_0's auc: 0.767987
[34]	valid_0's auc: 0.767913
[35]	valid_0's auc: 0.767958
[36]	valid_0's auc: 0.767726
[37]	valid_0's auc: 0.77144
[38]	valid_0's auc: 0.771983
[39]	valid_0's auc: 0.772072
[40]	valid_0's auc: 0.772642
[41]	valid_0's auc: 0.7724
[42]	valid_0's auc: 0.772606
[43]	valid_0's auc: 0.773112
[44]	valid_0's auc: 0.773262
[45]	valid_0's auc: 0.773283
[46]	valid_0's auc: 0.7736
[47]	valid_0's auc: 0.773597
[48]	valid_0's auc: 0.774146
[49]	valid_0's auc: 0.77475
[50]	valid_0's auc: 0.775632
[51]	valid_0's auc: 0.775868
[52]	valid_0's auc: 0.776302
[53]	valid_0's auc: 0.776677
[54]	valid_0's auc: 0.776795
[55]	valid_0's auc: 0.777018
[56]	valid_0's auc: 0.776968
[57]	valid_0's auc: 0.7774
[58]	valid_0's auc: 0.77728
[59]	valid_0's auc: 0.777279
[60]	valid_0's auc: 0.777589
[61]	valid_0's auc: 0.777624
[62]	valid_0's auc: 0.777569
[63]	valid_0's auc: 0.777958
[64]	valid_0's auc: 0.778038
[65]	valid_0's auc: 0.777877
[66]	valid_0's auc: 0.777922
[67]	valid_0's auc: 0.777913
[68]	valid_0's auc: 0.777988
[69]	valid_0's auc: 0.778129
[70]	valid_0's auc: 0.778129
[71]	valid_0's auc: 0.778101
[72]	valid_0's auc: 0.778072
[73]	valid_0's auc: 0.778109
[74]	valid_0's auc: 0.778116
[75]	valid_0's auc: 0.778077
[76]	valid_0's auc: 0.778073
[77]	valid_0's auc: 0.778366
[78]	valid_0's auc: 0.778557
[79]	valid_0's auc: 0.778774
[80]	valid_0's auc: 0.778649
[81]	valid_0's auc: 0.778791
[82]	valid_0's auc: 0.779486
[83]	valid_0's auc: 0.779434
[84]	valid_0's auc: 0.779476
[85]	valid_0's auc: 0.779434
[86]	valid_0's auc: 0.779464
[87]	valid_0's auc: 0.779465
[88]	valid_0's auc: 0.779408
[89]	valid_0's auc: 0.779419
[90]	valid_0's auc: 0.779439
[91]	valid_0's auc: 0.779496
[92]	valid_0's auc: 0.779354
[93]	valid_0's auc: 0.779674
[94]	valid_0's auc: 0.779704
[95]	valid_0's auc: 0.779594
[96]	valid_0's auc: 0.779564
[97]	valid_0's auc: 0.779641
[98]	valid_0's auc: 0.779724
[99]	valid_0's auc: 0.779738
[100]	valid_0's auc: 0.779794
[1]	valid_0's auc: 0.754636
[2]	valid_0's auc: 0.768299
[3]	valid_0's auc: 0.771575
[4]	valid_0's auc: 0.770665
[5]	valid_0's auc: 0.770665
[6]	valid_0's auc: 0.772371
[7]	valid_0's auc: 0.772371
[8]	valid_0's auc: 0.772084
[9]	valid_0's auc: 0.771631
[10]	valid_0's auc: 0.771184
[11]	valid_0's auc: 0.771361
[12]	valid_0's auc: 0.771364
[13]	valid_0's auc: 0.771236
[14]	valid_0's auc: 0.77209
[15]	valid_0's auc: 0.771527
[16]	valid_0's auc: 0.773395
[17]	valid_0's auc: 0.772813
[18]	valid_0's auc: 0.772593
[19]	valid_0's auc: 0.772593
[20]	valid_0's auc: 0.772593
[21]	valid_0's auc: 0.772756
[22]	valid_0's auc: 0.772602
[23]	valid_0's auc: 0.77261
[24]	valid_0's auc: 0.772405
[25]	valid_0's auc: 0.772337
[26]	valid_0's auc: 0.773139
[27]	valid_0's auc: 0.774044
[28]	valid_0's auc: 0.774584
[29]	valid_0's auc: 0.774653
[30]	valid_0's auc: 0.774633
[31]	valid_0's auc: 0.775546
[32]	valid_0's auc: 0.775245
[33]	valid_0's auc: 0.775
[34]	valid_0's auc: 0.774977
[35]	valid_0's auc: 0.774951
[36]	valid_0's auc: 0.774779
[37]	valid_0's auc: 0.774521
[38]	valid_0's auc: 0.775045
[39]	valid_0's auc: 0.775058
[40]	valid_0's auc: 0.774883
[41]	valid_0's auc: 0.774926
[42]	valid_0's auc: 0.775126
[43]	valid_0's auc: 0.775779
[44]	valid_0's auc: 0.77582
[45]	valid_0's auc: 0.775998
[46]	valid_0's auc: 0.775977
[47]	valid_0's auc: 0.775857
[48]	valid_0's auc: 0.775963
[49]	valid_0's auc: 0.77571
[50]	valid_0's auc: 0.77598
[51]	valid_0's auc: 0.776138
[52]	valid_0's auc: 0.776229
[53]	valid_0's auc: 0.776544
[54]	valid_0's auc: 0.777067
[55]	valid_0's auc: 0.776815
[56]	valid_0's auc: 0.776867
[57]	valid_0's auc: 0.777088
[58]	valid_0's auc: 0.77714
[59]	valid_0's auc: 0.777164
[60]	valid_0's auc: 0.777809
[61]	valid_0's auc: 0.777681
[62]	valid_0's auc: 0.777586
[63]	valid_0's auc: 0.777788
[64]	valid_0's auc: 0.777767
[65]	valid_0's auc: 0.778097
[66]	valid_0's auc: 0.77814
[67]	valid_0's auc: 0.778187
[68]	valid_0's auc: 0.778159
[69]	valid_0's auc: 0.778054
[70]	valid_0's auc: 0.778241
[71]	valid_0's auc: 0.778186
[72]	valid_0's auc: 0.778181
[73]	valid_0's auc: 0.778076
[74]	valid_0's auc: 0.778134
[75]	valid_0's auc: 0.778119
[76]	valid_0's auc: 0.778214
[77]	valid_0's auc: 0.778572
[78]	valid_0's auc: 0.779137
[79]	valid_0's auc: 0.778959
[80]	valid_0's auc: 0.77866
[81]	valid_0's auc: 0.779315
[82]	valid_0's auc: 0.779205
[83]	valid_0's auc: 0.77921
[84]	valid_0's auc: 0.779256
[85]	valid_0's auc: 0.778944
[86]	valid_0's auc: 0.778859
[87]	valid_0's auc: 0.77897
[88]	valid_0's auc: 0.779201
[89]	valid_0's auc: 0.779222
[90]	valid_0's auc: 0.779197
[91]	valid_0's auc: 0.779267
[92]	valid_0's auc: 0.778952
[93]	valid_0's auc: 0.778946
[94]	valid_0's auc: 0.778927
[95]	valid_0's auc: 0.779292
[96]	valid_0's auc: 0.77937
[97]	valid_0's auc: 0.779374
[98]	valid_0's auc: 0.779003
[99]	valid_0's auc: 0.779013
[100]	valid_0's auc: 0.778886
[1]	valid_0's auc: 0.757698
[2]	valid_0's auc: 0.757948
[3]	valid_0's auc: 0.758047
[4]	valid_0's auc: 0.758753
[5]	valid_0's auc: 0.758732
[6]	valid_0's auc: 0.761583
[7]	valid_0's auc: 0.761583
[8]	valid_0's auc: 0.761034
[9]	valid_0's auc: 0.762049
[10]	valid_0's auc: 0.762074
[11]	valid_0's auc: 0.762198
[12]	valid_0's auc: 0.762199
[13]	valid_0's auc: 0.767967
[14]	valid_0's auc: 0.767981
[15]	valid_0's auc: 0.767899
[16]	valid_0's auc: 0.767954
[17]	valid_0's auc: 0.769467
[18]	valid_0's auc: 0.769501
[19]	valid_0's auc: 0.769247
[20]	valid_0's auc: 0.769247
[21]	valid_0's auc: 0.769322
[22]	valid_0's auc: 0.769347
[23]	valid_0's auc: 0.769282
[24]	valid_0's auc: 0.769233
[25]	valid_0's auc: 0.769509
[26]	valid_0's auc: 0.769618
[27]	valid_0's auc: 0.770013
[28]	valid_0's auc: 0.770055
[29]	valid_0's auc: 0.770067
[30]	valid_0's auc: 0.76999
[31]	valid_0's auc: 0.770295
[32]	valid_0's auc: 0.770246
[33]	valid_0's auc: 0.770121
[34]	valid_0's auc: 0.770037
[35]	valid_0's auc: 0.770074
[36]	valid_0's auc: 0.770297
[37]	valid_0's auc: 0.77025
[38]	valid_0's auc: 0.77052
[39]	valid_0's auc: 0.770531
[40]	valid_0's auc: 0.770241
[41]	valid_0's auc: 0.770249
[42]	valid_0's auc: 0.770513
[43]	valid_0's auc: 0.771491
[44]	valid_0's auc: 0.771462
[45]	valid_0's auc: 0.771853
[46]	valid_0's auc: 0.771821
[47]	valid_0's auc: 0.771835
[48]	valid_0's auc: 0.772956
[49]	valid_0's auc: 0.772811
[50]	valid_0's auc: 0.772768
[51]	valid_0's auc: 0.773013
[52]	valid_0's auc: 0.773168
[53]	valid_0's auc: 0.772969
[54]	valid_0's auc: 0.773559
[55]	valid_0's auc: 0.773395
[56]	valid_0's auc: 0.773962
[57]	valid_0's auc: 0.773393
[58]	valid_0's auc: 0.773411
[59]	valid_0's auc: 0.773418
[60]	valid_0's auc: 0.773612
[61]	valid_0's auc: 0.773587
[62]	valid_0's auc: 0.774549
[63]	valid_0's auc: 0.775026
[64]	valid_0's auc: 0.775062
[65]	valid_0's auc: 0.775142
[66]	valid_0's auc: 0.775147
[67]	valid_0's auc: 0.775173
[68]	valid_0's auc: 0.775176
[69]	valid_0's auc: 0.774918
[70]	valid_0's auc: 0.775034
[71]	valid_0's auc: 0.77502
[72]	valid_0's auc: 0.775638
[73]	valid_0's auc: 0.775681
[74]	valid_0's auc: 0.775659
[75]	valid_0's auc: 0.775646
[76]	valid_0's auc: 0.775464
[77]	valid_0's auc: 0.775468
[78]	valid_0's auc: 0.776185
[79]	valid_0's auc: 0.776177
[80]	valid_0's auc: 0.776149
[81]	valid_0's auc: 0.776095
[82]	valid_0's auc: 0.77591
[83]	valid_0's auc: 0.77598
[84]	valid_0's auc: 0.77593
[85]	valid_0's auc: 0.776513
[86]	valid_0's auc: 0.776489
[87]	valid_0's auc: 0.776508
[88]	valid_0's auc: 0.776496
[89]	valid_0's auc: 0.776483
[90]	valid_0's auc: 0.776364
[91]	valid_0's auc: 0.776394
[92]	valid_0's auc: 0.776189
[93]	valid_0's auc: 0.776245
[94]	valid_0's auc: 0.776325
[95]	valid_0's auc: 0.776434
[96]	valid_0's auc: 0.776475
[97]	valid_0's auc: 0.776512
[98]	valid_0's auc: 0.776745
[99]	valid_0's auc: 0.776682
[100]	valid_0's auc: 0.776691
[1]	valid_0's auc: 0.760154
[2]	valid_0's auc: 0.760161
[3]	valid_0's auc: 0.761275
[4]	valid_0's auc: 0.761194
[5]	valid_0's auc: 0.761194
[6]	valid_0's auc: 0.761366
[7]	valid_0's auc: 0.761366
[8]	valid_0's auc: 0.761278
[9]	valid_0's auc: 0.761437
[10]	valid_0's auc: 0.763737
[11]	valid_0's auc: 0.763776
[12]	valid_0's auc: 0.763719
[13]	valid_0's auc: 0.76376
[14]	valid_0's auc: 0.765324
[15]	valid_0's auc: 0.764771
[16]	valid_0's auc: 0.765004
[17]	valid_0's auc: 0.765143
[18]	valid_0's auc: 0.765139
[19]	valid_0's auc: 0.766475
[20]	valid_0's auc: 0.766475
[21]	valid_0's auc: 0.766487
[22]	valid_0's auc: 0.766607
[23]	valid_0's auc: 0.766948
[24]	valid_0's auc: 0.766969
[25]	valid_0's auc: 0.767293
[26]	valid_0's auc: 0.76715
[27]	valid_0's auc: 0.767735
[28]	valid_0's auc: 0.767482
[29]	valid_0's auc: 0.767415
[30]	valid_0's auc: 0.767373
[31]	valid_0's auc: 0.767466
[32]	valid_0's auc: 0.767447
[33]	valid_0's auc: 0.767597
[34]	valid_0's auc: 0.767592
[35]	valid_0's auc: 0.767583
[36]	valid_0's auc: 0.767535
[37]	valid_0's auc: 0.767569
[38]	valid_0's auc: 0.767951
[39]	valid_0's auc: 0.767957
[40]	valid_0's auc: 0.767971
[41]	valid_0's auc: 0.767935
[42]	valid_0's auc: 0.772863
[43]	valid_0's auc: 0.772794
[44]	valid_0's auc: 0.772807
[45]	valid_0's auc: 0.772689
[46]	valid_0's auc: 0.773554
[47]	valid_0's auc: 0.772882
[48]	valid_0's auc: 0.773191
[49]	valid_0's auc: 0.773343
[50]	valid_0's auc: 0.773406
[51]	valid_0's auc: 0.773419
[52]	valid_0's auc: 0.773489
[53]	valid_0's auc: 0.773811
[54]	valid_0's auc: 0.774199
[55]	valid_0's auc: 0.774412
[56]	valid_0's auc: 0.774471
[57]	valid_0's auc: 0.774837
[58]	valid_0's auc: 0.774854
[59]	valid_0's auc: 0.774845
[60]	valid_0's auc: 0.775037
[61]	valid_0's auc: 0.775005
[62]	valid_0's auc: 0.775253
[63]	valid_0's auc: 0.775098
[64]	valid_0's auc: 0.775211
[65]	valid_0's auc: 0.775622
[66]	valid_0's auc: 0.775587
[67]	valid_0's auc: 0.775568
[68]	valid_0's auc: 0.775616
[69]	valid_0's auc: 0.775411
[70]	valid_0's auc: 0.776291
[71]	valid_0's auc: 0.776298
[72]	valid_0's auc: 0.776501
[73]	valid_0's auc: 0.77645
[74]	valid_0's auc: 0.776494
[75]	valid_0's auc: 0.776403
[76]	valid_0's auc: 0.775687
[77]	valid_0's auc: 0.77577
[78]	valid_0's auc: 0.776862
[79]	valid_0's auc: 0.776278
[80]	valid_0's auc: 0.777233
[81]	valid_0's auc: 0.777851
[82]	valid_0's auc: 0.778318
[83]	valid_0's auc: 0.778309
[84]	valid_0's auc: 0.778293
[85]	valid_0's auc: 0.778611
[86]	valid_0's auc: 0.778525
[87]	valid_0's auc: 0.77854
[88]	valid_0's auc: 0.778512
[89]	valid_0's auc: 0.778519
[90]	valid_0's auc: 0.778525
[91]	valid_0's auc: 0.778497
[92]	valid_0's auc: 0.778285
[93]	valid_0's auc: 0.778654
[94]	valid_0's auc: 0.77857
[95]	valid_0's auc: 0.778521
[96]	valid_0's auc: 0.778782
[97]	valid_0's auc: 0.778835
[98]	valid_0's auc: 0.779212
[99]	valid_0's auc: 0.779191
[100]	valid_0's auc: 0.779226
[1]	valid_0's auc: 0.759786
[2]	valid_0's auc: 0.764657
[3]	valid_0's auc: 0.765416
[4]	valid_0's auc: 0.765122
[5]	valid_0's auc: 0.765122
[6]	valid_0's auc: 0.76737
[7]	valid_0's auc: 0.76737
[8]	valid_0's auc: 0.768347
[9]	valid_0's auc: 0.768542
[10]	valid_0's auc: 0.769538
[11]	valid_0's auc: 0.769651
[12]	valid_0's auc: 0.769697
[13]	valid_0's auc: 0.769702
[14]	valid_0's auc: 0.769586
[15]	valid_0's auc: 0.769466
[16]	valid_0's auc: 0.769774
[17]	valid_0's auc: 0.769894
[18]	valid_0's auc: 0.769968
[19]	valid_0's auc: 0.769984
[20]	valid_0's auc: 0.769984
[21]	valid_0's auc: 0.769921
[22]	valid_0's auc: 0.770246
[23]	valid_0's auc: 0.770348
[24]	valid_0's auc: 0.770537
[25]	valid_0's auc: 0.772107
[26]	valid_0's auc: 0.772154
[27]	valid_0's auc: 0.77229
[28]	valid_0's auc: 0.772606
[29]	valid_0's auc: 0.772221
[30]	valid_0's auc: 0.77217
[31]	valid_0's auc: 0.772283
[32]	valid_0's auc: 0.772148
[33]	valid_0's auc: 0.773711
[34]	valid_0's auc: 0.773201
[35]	valid_0's auc: 0.772608
[36]	valid_0's auc: 0.773908
[37]	valid_0's auc: 0.774534
[38]	valid_0's auc: 0.774751
[39]	valid_0's auc: 0.774889
[40]	valid_0's auc: 0.776626
[41]	valid_0's auc: 0.776385
[42]	valid_0's auc: 0.776264
[43]	valid_0's auc: 0.776878
[44]	valid_0's auc: 0.777016
[45]	valid_0's auc: 0.777825
[46]	valid_0's auc: 0.778712
[47]	valid_0's auc: 0.778604
[48]	valid_0's auc: 0.778727
[49]	valid_0's auc: 0.779104
[50]	valid_0's auc: 0.779935
[51]	valid_0's auc: 0.78035
[52]	valid_0's auc: 0.781057
[53]	valid_0's auc: 0.781113
[54]	valid_0's auc: 0.781424
[55]	valid_0's auc: 0.781814
[56]	valid_0's auc: 0.781871
[57]	valid_0's auc: 0.782098
[58]	valid_0's auc: 0.782113
[59]	valid_0's auc: 0.782133
[60]	valid_0's auc: 0.782569
[61]	valid_0's auc: 0.782645
[62]	valid_0's auc: 0.782596
[63]	valid_0's auc: 0.782709
[64]	valid_0's auc: 0.782761
[65]	valid_0's auc: 0.78289
[66]	valid_0's auc: 0.782825
[67]	valid_0's auc: 0.782821
[68]	valid_0's auc: 0.782796
[69]	valid_0's auc: 0.783017
[70]	valid_0's auc: 0.783031
[71]	valid_0's auc: 0.783049
[72]	valid_0's auc: 0.783257
[73]	valid_0's auc: 0.783257
[74]	valid_0's auc: 0.783331
[75]	valid_0's auc: 0.783258
[76]	valid_0's auc: 0.783237
[77]	valid_0's auc: 0.783262
[78]	valid_0's auc: 0.783189
[79]	valid_0's auc: 0.783448
[80]	valid_0's auc: 0.783488
[81]	valid_0's auc: 0.783661
[82]	valid_0's auc: 0.783609
[83]	valid_0's auc: 0.783617
[84]	valid_0's auc: 0.783657
[85]	valid_0's auc: 0.783607
[86]	valid_0's auc: 0.783621
[87]	valid_0's auc: 0.783634
[88]	valid_0's auc: 0.783628
[89]	valid_0's auc: 0.783654
[90]	valid_0's auc: 0.783633
[91]	valid_0's auc: 0.783581
[92]	valid_0's auc: 0.783617
[93]	valid_0's auc: 0.783681
[94]	valid_0's auc: 0.783675
[95]	valid_0's auc: 0.783676
[96]	valid_0's auc: 0.783612
[97]	valid_0's auc: 0.783626
[98]	valid_0's auc: 0.783547
[99]	valid_0's auc: 0.783576
[100]	valid_0's auc: 0.783671
[1]	valid_0's auc: 0.762215
[2]	valid_0's auc: 0.772303
[3]	valid_0's auc: 0.775269
[4]	valid_0's auc: 0.774358
[5]	valid_0's auc: 0.77429
[6]	valid_0's auc: 0.77402
[7]	valid_0's auc: 0.77402
[8]	valid_0's auc: 0.775664
[9]	valid_0's auc: 0.775678
[10]	valid_0's auc: 0.775709
[11]	valid_0's auc: 0.775825
[12]	valid_0's auc: 0.775823
[13]	valid_0's auc: 0.775947
[14]	valid_0's auc: 0.777737
[15]	valid_0's auc: 0.778991
[16]	valid_0's auc: 0.778649
[17]	valid_0's auc: 0.778267
[18]	valid_0's auc: 0.778369
[19]	valid_0's auc: 0.778369
[20]	valid_0's auc: 0.778369
[21]	valid_0's auc: 0.778406
[22]	valid_0's auc: 0.778805
[23]	valid_0's auc: 0.77886
[24]	valid_0's auc: 0.778419
[25]	valid_0's auc: 0.778975
[26]	valid_0's auc: 0.779186
[27]	valid_0's auc: 0.779731
[28]	valid_0's auc: 0.780122
[29]	valid_0's auc: 0.780195
[30]	valid_0's auc: 0.780215
[31]	valid_0's auc: 0.780063
[32]	valid_0's auc: 0.780154
[33]	valid_0's auc: 0.781266
[34]	valid_0's auc: 0.781285
[35]	valid_0's auc: 0.781159
[36]	valid_0's auc: 0.781426
[37]	valid_0's auc: 0.781279
[38]	valid_0's auc: 0.781604
[39]	valid_0's auc: 0.78167
[40]	valid_0's auc: 0.781357
[41]	valid_0's auc: 0.781469
[42]	valid_0's auc: 0.781396
[43]	valid_0's auc: 0.781686
[44]	valid_0's auc: 0.781696
[45]	valid_0's auc: 0.78168
[46]	valid_0's auc: 0.781665
[47]	valid_0's auc: 0.78171
[48]	valid_0's auc: 0.781954
[49]	valid_0's auc: 0.781862
[50]	valid_0's auc: 0.781908
[51]	valid_0's auc: 0.781837
[52]	valid_0's auc: 0.781664
[53]	valid_0's auc: 0.781803
[54]	valid_0's auc: 0.781854
[55]	valid_0's auc: 0.782152
[56]	valid_0's auc: 0.783181
[57]	valid_0's auc: 0.783197
[58]	valid_0's auc: 0.783183
[59]	valid_0's auc: 0.783246
[60]	valid_0's auc: 0.78318
[61]	valid_0's auc: 0.783197
[62]	valid_0's auc: 0.783328
[63]	valid_0's auc: 0.7834
[64]	valid_0's auc: 0.78337
[65]	valid_0's auc: 0.7834
[66]	valid_0's auc: 0.783353
[67]	valid_0's auc: 0.783377
[68]	valid_0's auc: 0.783359
[69]	valid_0's auc: 0.783296
[70]	valid_0's auc: 0.78347
[71]	valid_0's auc: 0.783464
[72]	valid_0's auc: 0.783491
[73]	valid_0's auc: 0.78345
[74]	valid_0's auc: 0.783439
[75]	valid_0's auc: 0.783415
[76]	valid_0's auc: 0.783675
[77]	valid_0's auc: 0.783755
[78]	valid_0's auc: 0.783625
[79]	valid_0's auc: 0.783749
[80]	valid_0's auc: 0.783941
[81]	valid_0's auc: 0.78379
[82]	valid_0's auc: 0.784143
[83]	valid_0's auc: 0.784113
[84]	valid_0's auc: 0.784115
[85]	valid_0's auc: 0.783958
[86]	valid_0's auc: 0.784025
[87]	valid_0's auc: 0.784022
[88]	valid_0's auc: 0.784077
[89]	valid_0's auc: 0.784074
[90]	valid_0's auc: 0.784081
[91]	valid_0's auc: 0.78425
[92]	valid_0's auc: 0.784216
[93]	valid_0's auc: 0.784472
[94]	valid_0's auc: 0.784427
[95]	valid_0's auc: 0.78431
[96]	valid_0's auc: 0.784436
[97]	valid_0's auc: 0.784476
[98]	valid_0's auc: 0.784678
[99]	valid_0's auc: 0.784664
[100]	valid_0's auc: 0.784485
[1]	valid_0's auc: 0.760477
[2]	valid_0's auc: 0.760477
[3]	valid_0's auc: 0.761049
[4]	valid_0's auc: 0.762803
[5]	valid_0's auc: 0.762803
[6]	valid_0's auc: 0.764426
[7]	valid_0's auc: 0.764426
[8]	valid_0's auc: 0.763876
[9]	valid_0's auc: 0.764608
[10]	valid_0's auc: 0.764789
[11]	valid_0's auc: 0.764761
[12]	valid_0's auc: 0.764618
[13]	valid_0's auc: 0.770857
[14]	valid_0's auc: 0.770718
[15]	valid_0's auc: 0.770783
[16]	valid_0's auc: 0.771002
[17]	valid_0's auc: 0.771467
[18]	valid_0's auc: 0.771629
[19]	valid_0's auc: 0.77172
[20]	valid_0's auc: 0.77172
[21]	valid_0's auc: 0.771722
[22]	valid_0's auc: 0.771593
[23]	valid_0's auc: 0.771624
[24]	valid_0's auc: 0.771752
[25]	valid_0's auc: 0.771876
[26]	valid_0's auc: 0.772083
[27]	valid_0's auc: 0.773213
[28]	valid_0's auc: 0.773004
[29]	valid_0's auc: 0.773093
[30]	valid_0's auc: 0.772967
[31]	valid_0's auc: 0.773117
[32]	valid_0's auc: 0.773116
[33]	valid_0's auc: 0.773506
[34]	valid_0's auc: 0.773394
[35]	valid_0's auc: 0.773326
[36]	valid_0's auc: 0.773443
[37]	valid_0's auc: 0.773701
[38]	valid_0's auc: 0.77445
[39]	valid_0's auc: 0.774493
[40]	valid_0's auc: 0.774396
[41]	valid_0's auc: 0.774179
[42]	valid_0's auc: 0.774358
[43]	valid_0's auc: 0.775014
[44]	valid_0's auc: 0.774964
[45]	valid_0's auc: 0.775465
[46]	valid_0's auc: 0.776707
[47]	valid_0's auc: 0.776664
[48]	valid_0's auc: 0.776713
[49]	valid_0's auc: 0.77668
[50]	valid_0's auc: 0.777371
[51]	valid_0's auc: 0.777236
[52]	valid_0's auc: 0.77733
[53]	valid_0's auc: 0.777753
[54]	valid_0's auc: 0.777805
[55]	valid_0's auc: 0.777714
[56]	valid_0's auc: 0.778697
[57]	valid_0's auc: 0.778794
[58]	valid_0's auc: 0.778777
[59]	valid_0's auc: 0.77882
[60]	valid_0's auc: 0.779645
[61]	valid_0's auc: 0.779869
[62]	valid_0's auc: 0.779814
[63]	valid_0's auc: 0.779875
[64]	valid_0's auc: 0.779835
[65]	valid_0's auc: 0.779921
[66]	valid_0's auc: 0.779885
[67]	valid_0's auc: 0.779928
[68]	valid_0's auc: 0.779884
[69]	valid_0's auc: 0.779864
[70]	valid_0's auc: 0.779671
[71]	valid_0's auc: 0.779628
[72]	valid_0's auc: 0.7797
[73]	valid_0's auc: 0.779669
[74]	valid_0's auc: 0.779662
[75]	valid_0's auc: 0.779602
[76]	valid_0's auc: 0.779586
[77]	valid_0's auc: 0.779785
[78]	valid_0's auc: 0.779903
[79]	valid_0's auc: 0.780292
[80]	valid_0's auc: 0.780547
[81]	valid_0's auc: 0.780595
[82]	valid_0's auc: 0.780638
[83]	valid_0's auc: 0.7806
[84]	valid_0's auc: 0.78062
[85]	valid_0's auc: 0.781164
[86]	valid_0's auc: 0.78102
[87]	valid_0's auc: 0.781019
[88]	valid_0's auc: 0.781117
[89]	valid_0's auc: 0.781135
[90]	valid_0's auc: 0.78102
[91]	valid_0's auc: 0.781003
[92]	valid_0's auc: 0.780928
[93]	valid_0's auc: 0.781373
[94]	valid_0's auc: 0.781373
[95]	valid_0's auc: 0.781288
[96]	valid_0's auc: 0.781271
[97]	valid_0's auc: 0.781279
[98]	valid_0's auc: 0.781668
[99]	valid_0's auc: 0.781582
[100]	valid_0's auc: 0.781647
[1]	valid_0's auc: 0.7495
[2]	valid_0's auc: 0.7495
[3]	valid_0's auc: 0.7495
[4]	valid_0's auc: 0.751338
[5]	valid_0's auc: 0.751338
[6]	valid_0's auc: 0.750327
[7]	valid_0's auc: 0.750327
[8]	valid_0's auc: 0.750327
[9]	valid_0's auc: 0.753589
[10]	valid_0's auc: 0.751663
[11]	valid_0's auc: 0.751663
[12]	valid_0's auc: 0.751663
[13]	valid_0's auc: 0.753629
[14]	valid_0's auc: 0.751667
[15]	valid_0's auc: 0.753799
[16]	valid_0's auc: 0.75409
[17]	valid_0's auc: 0.75437
[18]	valid_0's auc: 0.754555
[19]	valid_0's auc: 0.754555
[20]	valid_0's auc: 0.75437
[21]	valid_0's auc: 0.75454
[22]	valid_0's auc: 0.754536
[23]	valid_0's auc: 0.754532
[24]	valid_0's auc: 0.754602
[25]	valid_0's auc: 0.760514
[26]	valid_0's auc: 0.760553
[27]	valid_0's auc: 0.760625
[28]	valid_0's auc: 0.76055
[29]	valid_0's auc: 0.760601
[30]	valid_0's auc: 0.760469
[31]	valid_0's auc: 0.76243
[32]	valid_0's auc: 0.762366
[33]	valid_0's auc: 0.76231
[34]	valid_0's auc: 0.762278
[35]	valid_0's auc: 0.762393
[36]	valid_0's auc: 0.762182
[37]	valid_0's auc: 0.762012
[38]	valid_0's auc: 0.762041
[39]	valid_0's auc: 0.762299
[40]	valid_0's auc: 0.762127
[41]	valid_0's auc: 0.762302
[42]	valid_0's auc: 0.762337
[43]	valid_0's auc: 0.76231
[44]	valid_0's auc: 0.762147
[45]	valid_0's auc: 0.76238
[46]	valid_0's auc: 0.762362
[47]	valid_0's auc: 0.7624
[48]	valid_0's auc: 0.762364
[49]	valid_0's auc: 0.762933
[50]	valid_0's auc: 0.763672
[51]	valid_0's auc: 0.764058
[52]	valid_0's auc: 0.764455
[53]	valid_0's auc: 0.764269
[54]	valid_0's auc: 0.766175
[55]	valid_0's auc: 0.766913
[56]	valid_0's auc: 0.767065
[57]	valid_0's auc: 0.766834
[Verbose LightGBM training log condensed. The cell printed `valid_0's auc` after every boosting round across repeated 100-iteration runs (a hyperparameter search / cross-validation loop). The first run is truncated at its start and the last run is cut off at iteration 15; only the final per-run validation AUC is retained below.]

Final (iteration-100) validation AUC per run:
0.769725, 0.772598, 0.773306, 0.771987, 0.772973, 0.779794, 0.778886,
0.776691, 0.779226, 0.783671, 0.784485, 0.781647

One run reported a constant 0.7495 for all 100 iterations (the model never improved past its initial trees), and the final, truncated run was similarly stuck at 0.748453. The best final validation AUC observed in this block was 0.784485.
[16]	valid_0's auc: 0.748453
[17]	valid_0's auc: 0.748453
[18]	valid_0's auc: 0.748453
[19]	valid_0's auc: 0.748453
[20]	valid_0's auc: 0.748453
[21]	valid_0's auc: 0.748453
[22]	valid_0's auc: 0.748453
[23]	valid_0's auc: 0.748453
[24]	valid_0's auc: 0.748453
[25]	valid_0's auc: 0.748453
[26]	valid_0's auc: 0.748453
[27]	valid_0's auc: 0.748453
[28]	valid_0's auc: 0.748453
[29]	valid_0's auc: 0.748453
[30]	valid_0's auc: 0.748453
[31]	valid_0's auc: 0.748453
[32]	valid_0's auc: 0.748453
[33]	valid_0's auc: 0.748453
[34]	valid_0's auc: 0.748453
[35]	valid_0's auc: 0.748453
[36]	valid_0's auc: 0.748453
[37]	valid_0's auc: 0.748453
[38]	valid_0's auc: 0.748453
[39]	valid_0's auc: 0.748453
[40]	valid_0's auc: 0.748453
[41]	valid_0's auc: 0.748453
[42]	valid_0's auc: 0.748453
[43]	valid_0's auc: 0.748453
[44]	valid_0's auc: 0.748453
[45]	valid_0's auc: 0.748453
[46]	valid_0's auc: 0.748453
[47]	valid_0's auc: 0.748453
[48]	valid_0's auc: 0.748453
[49]	valid_0's auc: 0.748453
[50]	valid_0's auc: 0.748453
[51]	valid_0's auc: 0.748453
[52]	valid_0's auc: 0.748453
[53]	valid_0's auc: 0.748453
[54]	valid_0's auc: 0.748453
[55]	valid_0's auc: 0.748453
[56]	valid_0's auc: 0.748453
[57]	valid_0's auc: 0.748453
[58]	valid_0's auc: 0.748453
[59]	valid_0's auc: 0.748453
[60]	valid_0's auc: 0.748453
[61]	valid_0's auc: 0.748453
[62]	valid_0's auc: 0.748453
[63]	valid_0's auc: 0.748453
[64]	valid_0's auc: 0.748453
[65]	valid_0's auc: 0.748453
[66]	valid_0's auc: 0.748453
[67]	valid_0's auc: 0.748453
[68]	valid_0's auc: 0.748453
[69]	valid_0's auc: 0.748453
[70]	valid_0's auc: 0.748453
[71]	valid_0's auc: 0.748453
[72]	valid_0's auc: 0.748453
[73]	valid_0's auc: 0.748453
[74]	valid_0's auc: 0.748453
[75]	valid_0's auc: 0.748453
[76]	valid_0's auc: 0.748453
[77]	valid_0's auc: 0.748453
[78]	valid_0's auc: 0.748453
[79]	valid_0's auc: 0.748453
[80]	valid_0's auc: 0.748453
[81]	valid_0's auc: 0.748453
[82]	valid_0's auc: 0.748453
[83]	valid_0's auc: 0.748453
[84]	valid_0's auc: 0.748453
[85]	valid_0's auc: 0.748453
[86]	valid_0's auc: 0.748453
[87]	valid_0's auc: 0.748453
[88]	valid_0's auc: 0.748453
[89]	valid_0's auc: 0.748453
[90]	valid_0's auc: 0.748453
[91]	valid_0's auc: 0.748453
[92]	valid_0's auc: 0.748453
[93]	valid_0's auc: 0.748453
[94]	valid_0's auc: 0.748453
[95]	valid_0's auc: 0.748453
[96]	valid_0's auc: 0.748453
[97]	valid_0's auc: 0.748453
[98]	valid_0's auc: 0.749386
[99]	valid_0's auc: 0.749386
[100]	valid_0's auc: 0.749386
[1]	valid_0's auc: 0.748867
[2]	valid_0's auc: 0.748867
[3]	valid_0's auc: 0.748867
[4]	valid_0's auc: 0.748867
[5]	valid_0's auc: 0.748867
[6]	valid_0's auc: 0.748867
[7]	valid_0's auc: 0.748867
[8]	valid_0's auc: 0.748867
[9]	valid_0's auc: 0.748867
[10]	valid_0's auc: 0.748867
[11]	valid_0's auc: 0.748867
[12]	valid_0's auc: 0.748867
[13]	valid_0's auc: 0.748867
[14]	valid_0's auc: 0.748867
[15]	valid_0's auc: 0.748867
[16]	valid_0's auc: 0.748867
[17]	valid_0's auc: 0.748867
[18]	valid_0's auc: 0.748867
[19]	valid_0's auc: 0.748867
[20]	valid_0's auc: 0.748867
[21]	valid_0's auc: 0.748867
[22]	valid_0's auc: 0.748867
[23]	valid_0's auc: 0.748867
[24]	valid_0's auc: 0.748867
[25]	valid_0's auc: 0.748867
[26]	valid_0's auc: 0.748867
[27]	valid_0's auc: 0.748867
[28]	valid_0's auc: 0.748867
[29]	valid_0's auc: 0.748867
[30]	valid_0's auc: 0.748867
[31]	valid_0's auc: 0.748867
[32]	valid_0's auc: 0.748867
[33]	valid_0's auc: 0.748867
[34]	valid_0's auc: 0.748867
[35]	valid_0's auc: 0.748867
[36]	valid_0's auc: 0.748867
[37]	valid_0's auc: 0.748867
[38]	valid_0's auc: 0.748867
[39]	valid_0's auc: 0.748867
[40]	valid_0's auc: 0.748867
[41]	valid_0's auc: 0.748867
[42]	valid_0's auc: 0.748867
[43]	valid_0's auc: 0.748867
[44]	valid_0's auc: 0.748867
[45]	valid_0's auc: 0.748867
[46]	valid_0's auc: 0.748867
[47]	valid_0's auc: 0.748867
[48]	valid_0's auc: 0.748867
[49]	valid_0's auc: 0.748867
[50]	valid_0's auc: 0.748867
[51]	valid_0's auc: 0.748867
[52]	valid_0's auc: 0.748867
[53]	valid_0's auc: 0.748867
[54]	valid_0's auc: 0.748867
[55]	valid_0's auc: 0.748867
[56]	valid_0's auc: 0.757715
[57]	valid_0's auc: 0.757715
[58]	valid_0's auc: 0.757715
[59]	valid_0's auc: 0.757715
[60]	valid_0's auc: 0.757715
[61]	valid_0's auc: 0.757715
[62]	valid_0's auc: 0.757831
[63]	valid_0's auc: 0.757831
[64]	valid_0's auc: 0.757831
[65]	valid_0's auc: 0.757831
[66]	valid_0's auc: 0.757831
[67]	valid_0's auc: 0.757831
[68]	valid_0's auc: 0.757831
[69]	valid_0's auc: 0.757831
[70]	valid_0's auc: 0.757831
[71]	valid_0's auc: 0.757831
[72]	valid_0's auc: 0.757899
[73]	valid_0's auc: 0.757899
[74]	valid_0's auc: 0.757899
[75]	valid_0's auc: 0.75784
[76]	valid_0's auc: 0.75784
[77]	valid_0's auc: 0.75784
[78]	valid_0's auc: 0.757942
[79]	valid_0's auc: 0.758636
[80]	valid_0's auc: 0.758669
[81]	valid_0's auc: 0.758661
[82]	valid_0's auc: 0.759759
[83]	valid_0's auc: 0.759759
[84]	valid_0's auc: 0.760099
[85]	valid_0's auc: 0.759989
[86]	valid_0's auc: 0.759989
[87]	valid_0's auc: 0.759989
[88]	valid_0's auc: 0.759989
[89]	valid_0's auc: 0.759682
[90]	valid_0's auc: 0.759989
[91]	valid_0's auc: 0.759682
[92]	valid_0's auc: 0.759893
[93]	valid_0's auc: 0.760097
[94]	valid_0's auc: 0.760097
[95]	valid_0's auc: 0.760099
[96]	valid_0's auc: 0.76013
[97]	valid_0's auc: 0.76013
[98]	valid_0's auc: 0.760099
[99]	valid_0's auc: 0.760099
[100]	valid_0's auc: 0.760164
[1]	valid_0's auc: 0.748286
[2]	valid_0's auc: 0.748286
[3]	valid_0's auc: 0.748286
[4]	valid_0's auc: 0.748286
[5]	valid_0's auc: 0.748286
[6]	valid_0's auc: 0.748286
[7]	valid_0's auc: 0.748286
[8]	valid_0's auc: 0.748286
[9]	valid_0's auc: 0.748286
[10]	valid_0's auc: 0.748286
[11]	valid_0's auc: 0.748286
[12]	valid_0's auc: 0.748286
[13]	valid_0's auc: 0.748286
[14]	valid_0's auc: 0.748286
[15]	valid_0's auc: 0.748286
[16]	valid_0's auc: 0.748286
[17]	valid_0's auc: 0.748286
[18]	valid_0's auc: 0.748286
[19]	valid_0's auc: 0.748286
[20]	valid_0's auc: 0.748286
[21]	valid_0's auc: 0.748286
[22]	valid_0's auc: 0.748286
[23]	valid_0's auc: 0.748286
[24]	valid_0's auc: 0.748286
[25]	valid_0's auc: 0.748286
[26]	valid_0's auc: 0.748286
[27]	valid_0's auc: 0.748286
[28]	valid_0's auc: 0.748286
[29]	valid_0's auc: 0.748286
[30]	valid_0's auc: 0.748286
[31]	valid_0's auc: 0.748286
[32]	valid_0's auc: 0.748286
[33]	valid_0's auc: 0.748286
[34]	valid_0's auc: 0.748286
[35]	valid_0's auc: 0.748286
[36]	valid_0's auc: 0.748286
[37]	valid_0's auc: 0.748286
[38]	valid_0's auc: 0.748286
[39]	valid_0's auc: 0.748286
[40]	valid_0's auc: 0.748286
[41]	valid_0's auc: 0.748286
[42]	valid_0's auc: 0.748286
[43]	valid_0's auc: 0.748286
[44]	valid_0's auc: 0.748286
[45]	valid_0's auc: 0.748286
[46]	valid_0's auc: 0.748286
[47]	valid_0's auc: 0.748286
[48]	valid_0's auc: 0.748286
[49]	valid_0's auc: 0.748286
[50]	valid_0's auc: 0.748286
[51]	valid_0's auc: 0.748286
[52]	valid_0's auc: 0.748286
[53]	valid_0's auc: 0.748286
[54]	valid_0's auc: 0.748286
[55]	valid_0's auc: 0.748286
[56]	valid_0's auc: 0.748286
[57]	valid_0's auc: 0.748286
[58]	valid_0's auc: 0.748286
[59]	valid_0's auc: 0.748286
[60]	valid_0's auc: 0.748286
[61]	valid_0's auc: 0.748286
[62]	valid_0's auc: 0.748286
[63]	valid_0's auc: 0.748286
[64]	valid_0's auc: 0.748286
[65]	valid_0's auc: 0.748286
[66]	valid_0's auc: 0.748286
[67]	valid_0's auc: 0.748286
[68]	valid_0's auc: 0.748286
[69]	valid_0's auc: 0.748286
[70]	valid_0's auc: 0.748286
[71]	valid_0's auc: 0.748286
[72]	valid_0's auc: 0.748286
[73]	valid_0's auc: 0.748286
[74]	valid_0's auc: 0.748286
[75]	valid_0's auc: 0.748286
[76]	valid_0's auc: 0.748286
[77]	valid_0's auc: 0.748286
[78]	valid_0's auc: 0.748286
[79]	valid_0's auc: 0.748286
[80]	valid_0's auc: 0.748286
[81]	valid_0's auc: 0.748286
[82]	valid_0's auc: 0.748286
[83]	valid_0's auc: 0.748286
[84]	valid_0's auc: 0.748286
[85]	valid_0's auc: 0.748286
[86]	valid_0's auc: 0.748286
[87]	valid_0's auc: 0.748286
[88]	valid_0's auc: 0.748286
[89]	valid_0's auc: 0.748286
[90]	valid_0's auc: 0.748286
[91]	valid_0's auc: 0.748286
[92]	valid_0's auc: 0.748286
[93]	valid_0's auc: 0.748286
[94]	valid_0's auc: 0.748286
[95]	valid_0's auc: 0.748286
[96]	valid_0's auc: 0.748286
[97]	valid_0's auc: 0.748286
[98]	valid_0's auc: 0.748286
[99]	valid_0's auc: 0.748286
[100]	valid_0's auc: 0.748286
[1]	valid_0's auc: 0.757043
[2]	valid_0's auc: 0.757043
[3]	valid_0's auc: 0.757043
[4]	valid_0's auc: 0.757043
[5]	valid_0's auc: 0.757043
[6]	valid_0's auc: 0.757043
[7]	valid_0's auc: 0.757043
[8]	valid_0's auc: 0.757043
[9]	valid_0's auc: 0.757043
[10]	valid_0's auc: 0.757043
[11]	valid_0's auc: 0.757043
[12]	valid_0's auc: 0.757043
[13]	valid_0's auc: 0.757043
[14]	valid_0's auc: 0.757043
[15]	valid_0's auc: 0.757043
[16]	valid_0's auc: 0.757043
[17]	valid_0's auc: 0.757043
[18]	valid_0's auc: 0.757043
[19]	valid_0's auc: 0.757043
[20]	valid_0's auc: 0.757043
[21]	valid_0's auc: 0.757043
[22]	valid_0's auc: 0.757043
[23]	valid_0's auc: 0.757043
[24]	valid_0's auc: 0.757043
[25]	valid_0's auc: 0.757043
[26]	valid_0's auc: 0.757043
[27]	valid_0's auc: 0.757043
[28]	valid_0's auc: 0.757043
[29]	valid_0's auc: 0.757043
[30]	valid_0's auc: 0.757043
[31]	valid_0's auc: 0.757043
[32]	valid_0's auc: 0.757043
[33]	valid_0's auc: 0.757043
[34]	valid_0's auc: 0.757043
[35]	valid_0's auc: 0.757043
[36]	valid_0's auc: 0.757043
[37]	valid_0's auc: 0.757043
[38]	valid_0's auc: 0.757043
[39]	valid_0's auc: 0.757043
[40]	valid_0's auc: 0.757043
[41]	valid_0's auc: 0.757043
[42]	valid_0's auc: 0.757043
[43]	valid_0's auc: 0.757043
[44]	valid_0's auc: 0.757043
[45]	valid_0's auc: 0.757043
[46]	valid_0's auc: 0.757043
[47]	valid_0's auc: 0.757043
[48]	valid_0's auc: 0.757043
[49]	valid_0's auc: 0.757043
[50]	valid_0's auc: 0.757043
[51]	valid_0's auc: 0.757043
[52]	valid_0's auc: 0.757043
[53]	valid_0's auc: 0.757043
[54]	valid_0's auc: 0.757043
[55]	valid_0's auc: 0.757043
[56]	valid_0's auc: 0.757043
[57]	valid_0's auc: 0.757043
[58]	valid_0's auc: 0.757043
[59]	valid_0's auc: 0.757043
[60]	valid_0's auc: 0.757043
[61]	valid_0's auc: 0.757043
[62]	valid_0's auc: 0.757043
[63]	valid_0's auc: 0.757043
[64]	valid_0's auc: 0.757043
[65]	valid_0's auc: 0.757043
[66]	valid_0's auc: 0.757043
[67]	valid_0's auc: 0.757043
[68]	valid_0's auc: 0.757043
[69]	valid_0's auc: 0.757043
[70]	valid_0's auc: 0.757043
[71]	valid_0's auc: 0.757043
[72]	valid_0's auc: 0.757043
[73]	valid_0's auc: 0.757043
[74]	valid_0's auc: 0.757043
[75]	valid_0's auc: 0.757043
[76]	valid_0's auc: 0.757043
[77]	valid_0's auc: 0.757043
[78]	valid_0's auc: 0.757043
[79]	valid_0's auc: 0.757043
[80]	valid_0's auc: 0.757043
[81]	valid_0's auc: 0.757043
[82]	valid_0's auc: 0.757043
[83]	valid_0's auc: 0.757043
[84]	valid_0's auc: 0.757043
[85]	valid_0's auc: 0.757043
[86]	valid_0's auc: 0.757043
[87]	valid_0's auc: 0.757043
[88]	valid_0's auc: 0.757043
[89]	valid_0's auc: 0.757043
[90]	valid_0's auc: 0.757043
[91]	valid_0's auc: 0.757043
[92]	valid_0's auc: 0.757043
[93]	valid_0's auc: 0.757043
[94]	valid_0's auc: 0.757043
[95]	valid_0's auc: 0.757043
[96]	valid_0's auc: 0.757043
[97]	valid_0's auc: 0.757043
[98]	valid_0's auc: 0.757043
[99]	valid_0's auc: 0.757043
[100]	valid_0's auc: 0.757043
[1]	valid_0's auc: 0.752298
[2]	valid_0's auc: 0.752298
[3]	valid_0's auc: 0.752298
[4]	valid_0's auc: 0.752298
[5]	valid_0's auc: 0.752298
[6]	valid_0's auc: 0.752298
[7]	valid_0's auc: 0.752298
[8]	valid_0's auc: 0.752298
[9]	valid_0's auc: 0.752298
[10]	valid_0's auc: 0.752298
[11]	valid_0's auc: 0.752298
[12]	valid_0's auc: 0.752298
[13]	valid_0's auc: 0.752298
[14]	valid_0's auc: 0.752298
[15]	valid_0's auc: 0.752298
[16]	valid_0's auc: 0.752298
[17]	valid_0's auc: 0.752298
[18]	valid_0's auc: 0.752298
[19]	valid_0's auc: 0.752298
[20]	valid_0's auc: 0.752298
[21]	valid_0's auc: 0.752298
[22]	valid_0's auc: 0.752298
[23]	valid_0's auc: 0.752298
[24]	valid_0's auc: 0.752298
[25]	valid_0's auc: 0.752298
[26]	valid_0's auc: 0.752298
[27]	valid_0's auc: 0.752298
[28]	valid_0's auc: 0.752298
[29]	valid_0's auc: 0.752298
[30]	valid_0's auc: 0.752298
[31]	valid_0's auc: 0.752298
[32]	valid_0's auc: 0.752298
[33]	valid_0's auc: 0.752298
[34]	valid_0's auc: 0.752298
[35]	valid_0's auc: 0.752298
[36]	valid_0's auc: 0.752298
[37]	valid_0's auc: 0.752298
[38]	valid_0's auc: 0.752298
[39]	valid_0's auc: 0.752298
[40]	valid_0's auc: 0.752303
[41]	valid_0's auc: 0.752303
[42]	valid_0's auc: 0.752303
[43]	valid_0's auc: 0.752303
[44]	valid_0's auc: 0.752303
[45]	valid_0's auc: 0.752303
[46]	valid_0's auc: 0.752303
[47]	valid_0's auc: 0.752303
[48]	valid_0's auc: 0.752303
[49]	valid_0's auc: 0.752662
[50]	valid_0's auc: 0.752662
[51]	valid_0's auc: 0.752662
[52]	valid_0's auc: 0.752662
[53]	valid_0's auc: 0.752662
[54]	valid_0's auc: 0.752768
[55]	valid_0's auc: 0.752768
[56]	valid_0's auc: 0.752768
[57]	valid_0's auc: 0.752768
[58]	valid_0's auc: 0.752768
[59]	valid_0's auc: 0.752768
[60]	valid_0's auc: 0.752768
[61]	valid_0's auc: 0.752768
[62]	valid_0's auc: 0.752768
[63]	valid_0's auc: 0.752768
[64]	valid_0's auc: 0.752768
[65]	valid_0's auc: 0.752768
[66]	valid_0's auc: 0.752768
[67]	valid_0's auc: 0.752768
[68]	valid_0's auc: 0.752768
[69]	valid_0's auc: 0.752768
[70]	valid_0's auc: 0.752768
[71]	valid_0's auc: 0.752768
[72]	valid_0's auc: 0.752768
[73]	valid_0's auc: 0.752768
[74]	valid_0's auc: 0.752768
[75]	valid_0's auc: 0.752768
[76]	valid_0's auc: 0.752768
[77]	valid_0's auc: 0.757295
[78]	valid_0's auc: 0.757295
[79]	valid_0's auc: 0.757295
[80]	valid_0's auc: 0.757295
[81]	valid_0's auc: 0.757295
[82]	valid_0's auc: 0.757295
[83]	valid_0's auc: 0.757295
[84]	valid_0's auc: 0.757295
[85]	valid_0's auc: 0.757295
[86]	valid_0's auc: 0.757295
[87]	valid_0's auc: 0.757295
[88]	valid_0's auc: 0.757295
[89]	valid_0's auc: 0.757295
[90]	valid_0's auc: 0.757295
[91]	valid_0's auc: 0.757295
[92]	valid_0's auc: 0.757295
[93]	valid_0's auc: 0.757295
[94]	valid_0's auc: 0.757295
[95]	valid_0's auc: 0.757295
[96]	valid_0's auc: 0.757295
[97]	valid_0's auc: 0.757295
[98]	valid_0's auc: 0.757295
[99]	valid_0's auc: 0.757295
[100]	valid_0's auc: 0.757295
[1]	valid_0's auc: 0.754636
[2]	valid_0's auc: 0.754636
[3]	valid_0's auc: 0.754636
[4]	valid_0's auc: 0.754636
[5]	valid_0's auc: 0.754636
[6]	valid_0's auc: 0.754636
[7]	valid_0's auc: 0.754636
[8]	valid_0's auc: 0.754636
[9]	valid_0's auc: 0.754636
[10]	valid_0's auc: 0.754636
[11]	valid_0's auc: 0.754636
[12]	valid_0's auc: 0.754636
[13]	valid_0's auc: 0.754636
[14]	valid_0's auc: 0.754636
[15]	valid_0's auc: 0.754636
[16]	valid_0's auc: 0.754636
[17]	valid_0's auc: 0.754636
[18]	valid_0's auc: 0.754636
[19]	valid_0's auc: 0.754636
[20]	valid_0's auc: 0.754636
[21]	valid_0's auc: 0.754636
[22]	valid_0's auc: 0.754636
[23]	valid_0's auc: 0.754636
[24]	valid_0's auc: 0.754636
[25]	valid_0's auc: 0.754636
[26]	valid_0's auc: 0.754636
[27]	valid_0's auc: 0.754636
[28]	valid_0's auc: 0.754636
[29]	valid_0's auc: 0.754636
[30]	valid_0's auc: 0.754636
[31]	valid_0's auc: 0.754636
[32]	valid_0's auc: 0.754636
[33]	valid_0's auc: 0.754636
[34]	valid_0's auc: 0.754636
[35]	valid_0's auc: 0.754636
[36]	valid_0's auc: 0.754636
[37]	valid_0's auc: 0.754636
[38]	valid_0's auc: 0.754636
[39]	valid_0's auc: 0.754636
[40]	valid_0's auc: 0.754636
[41]	valid_0's auc: 0.754636
[42]	valid_0's auc: 0.754636
[43]	valid_0's auc: 0.754636
[44]	valid_0's auc: 0.754636
[45]	valid_0's auc: 0.754636
[46]	valid_0's auc: 0.754636
[47]	valid_0's auc: 0.754636
[48]	valid_0's auc: 0.754636
[49]	valid_0's auc: 0.754636
[50]	valid_0's auc: 0.754636
[51]	valid_0's auc: 0.754636
[52]	valid_0's auc: 0.754636
[53]	valid_0's auc: 0.754636
[54]	valid_0's auc: 0.754636
[55]	valid_0's auc: 0.754636
[56]	valid_0's auc: 0.754636
[57]	valid_0's auc: 0.754636
[58]	valid_0's auc: 0.754636
[59]	valid_0's auc: 0.754636
[60]	valid_0's auc: 0.754636
[61]	valid_0's auc: 0.754636
[62]	valid_0's auc: 0.754636
[63]	valid_0's auc: 0.754636
[64]	valid_0's auc: 0.754636
[65]	valid_0's auc: 0.754636
[66]	valid_0's auc: 0.754636
[67]	valid_0's auc: 0.754636
[68]	valid_0's auc: 0.754636
[69]	valid_0's auc: 0.754636
[70]	valid_0's auc: 0.754636
[71]	valid_0's auc: 0.754636
[72]	valid_0's auc: 0.754636
[73]	valid_0's auc: 0.754636
[74]	valid_0's auc: 0.754636
[75]	valid_0's auc: 0.754636
[76]	valid_0's auc: 0.754636
[77]	valid_0's auc: 0.754636
[78]	valid_0's auc: 0.754636
[79]	valid_0's auc: 0.754636
[80]	valid_0's auc: 0.754636
[81]	valid_0's auc: 0.754636
[82]	valid_0's auc: 0.754636
[83]	valid_0's auc: 0.754636
[84]	valid_0's auc: 0.754636
[85]	valid_0's auc: 0.754636
[86]	valid_0's auc: 0.754636
[87]	valid_0's auc: 0.754636
[88]	valid_0's auc: 0.754636
[89]	valid_0's auc: 0.754636
[90]	valid_0's auc: 0.754636
[91]	valid_0's auc: 0.754636
[92]	valid_0's auc: 0.754636
[93]	valid_0's auc: 0.754636
[94]	valid_0's auc: 0.754636
[95]	valid_0's auc: 0.754636
[96]	valid_0's auc: 0.754636
[97]	valid_0's auc: 0.754636
[98]	valid_0's auc: 0.754636
[99]	valid_0's auc: 0.754636
[100]	valid_0's auc: 0.755244
[1]	valid_0's auc: 0.757698
[2]	valid_0's auc: 0.757698
[3]	valid_0's auc: 0.757698
[4]	valid_0's auc: 0.757698
[5]	valid_0's auc: 0.757698
[6]	valid_0's auc: 0.757698
[7]	valid_0's auc: 0.757698
[8]	valid_0's auc: 0.757698
[9]	valid_0's auc: 0.757698
[10]	valid_0's auc: 0.757698
[11]	valid_0's auc: 0.757698
[12]	valid_0's auc: 0.757698
[13]	valid_0's auc: 0.757698
[14]	valid_0's auc: 0.757698
[15]	valid_0's auc: 0.757698
[16]	valid_0's auc: 0.757698
[17]	valid_0's auc: 0.757698
[18]	valid_0's auc: 0.757698
[19]	valid_0's auc: 0.757698
[20]	valid_0's auc: 0.757698
[21]	valid_0's auc: 0.757698
[22]	valid_0's auc: 0.757698
[23]	valid_0's auc: 0.757698
[24]	valid_0's auc: 0.757698
[25]	valid_0's auc: 0.757698
[26]	valid_0's auc: 0.757698
[27]	valid_0's auc: 0.757698
[28]	valid_0's auc: 0.757698
[29]	valid_0's auc: 0.757698
[30]	valid_0's auc: 0.757698
[31]	valid_0's auc: 0.757698
[32]	valid_0's auc: 0.757698
[33]	valid_0's auc: 0.757698
[34]	valid_0's auc: 0.757698
[35]	valid_0's auc: 0.757698
[36]	valid_0's auc: 0.757698
[37]	valid_0's auc: 0.757698
[38]	valid_0's auc: 0.757698
[39]	valid_0's auc: 0.757698
[40]	valid_0's auc: 0.757698
[41]	valid_0's auc: 0.757698
[42]	valid_0's auc: 0.757698
[43]	valid_0's auc: 0.757698
[44]	valid_0's auc: 0.757698
[45]	valid_0's auc: 0.757698
[46]	valid_0's auc: 0.757698
[47]	valid_0's auc: 0.757698
[48]	valid_0's auc: 0.757698
[49]	valid_0's auc: 0.757698
[50]	valid_0's auc: 0.757698
[51]	valid_0's auc: 0.757698
[52]	valid_0's auc: 0.757698
[53]	valid_0's auc: 0.757698
[54]	valid_0's auc: 0.757698
[55]	valid_0's auc: 0.757698
[56]	valid_0's auc: 0.757698
[57]	valid_0's auc: 0.757698
[58]	valid_0's auc: 0.757698
[59]	valid_0's auc: 0.757698
[60]	valid_0's auc: 0.757698
[61]	valid_0's auc: 0.757698
[62]	valid_0's auc: 0.757698
[63]	valid_0's auc: 0.757698
[64]	valid_0's auc: 0.757698
[65]	valid_0's auc: 0.757698
[66]	valid_0's auc: 0.757698
[67]	valid_0's auc: 0.757698
[68]	valid_0's auc: 0.757698
[69]	valid_0's auc: 0.757698
[70]	valid_0's auc: 0.757698
[71]	valid_0's auc: 0.757698
[72]	valid_0's auc: 0.757698
[73]	valid_0's auc: 0.757698
[74]	valid_0's auc: 0.757698
[75]	valid_0's auc: 0.757698
[76]	valid_0's auc: 0.757698
[77]	valid_0's auc: 0.757698
[78]	valid_0's auc: 0.757698
[79]	valid_0's auc: 0.757698
[80]	valid_0's auc: 0.757698
[81]	valid_0's auc: 0.757698
[82]	valid_0's auc: 0.757698
[83]	valid_0's auc: 0.757698
[84]	valid_0's auc: 0.757698
[85]	valid_0's auc: 0.757698
[86]	valid_0's auc: 0.757698
[87]	valid_0's auc: 0.757698
[88]	valid_0's auc: 0.757698
[89]	valid_0's auc: 0.757698
[90]	valid_0's auc: 0.757698
[91]	valid_0's auc: 0.757698
[92]	valid_0's auc: 0.757698
[93]	valid_0's auc: 0.757698
[94]	valid_0's auc: 0.757698
[95]	valid_0's auc: 0.757698
[96]	valid_0's auc: 0.757698
[97]	valid_0's auc: 0.757698
[98]	valid_0's auc: 0.757698
[99]	valid_0's auc: 0.757698
[100]	valid_0's auc: 0.757698
[1]	valid_0's auc: 0.760154
[2]	valid_0's auc: 0.760154
[3]	valid_0's auc: 0.760154
[4]	valid_0's auc: 0.760154
[5]	valid_0's auc: 0.760154
[6]	valid_0's auc: 0.760154
[7]	valid_0's auc: 0.760154
[8]	valid_0's auc: 0.760154
[9]	valid_0's auc: 0.760154
[10]	valid_0's auc: 0.760154
[11]	valid_0's auc: 0.760154
[12]	valid_0's auc: 0.760154
[13]	valid_0's auc: 0.760154
[14]	valid_0's auc: 0.760154
[15]	valid_0's auc: 0.760154
[16]	valid_0's auc: 0.760154
[17]	valid_0's auc: 0.760154
[18]	valid_0's auc: 0.760154
[19]	valid_0's auc: 0.760154
[20]	valid_0's auc: 0.760154
[21]	valid_0's auc: 0.760154
[22]	valid_0's auc: 0.760154
[23]	valid_0's auc: 0.760154
[24]	valid_0's auc: 0.760154
[25]	valid_0's auc: 0.760154
[26]	valid_0's auc: 0.760154
[27]	valid_0's auc: 0.760154
[28]	valid_0's auc: 0.760154
[29]	valid_0's auc: 0.760154
[30]	valid_0's auc: 0.760154
[31]	valid_0's auc: 0.760154
[32]	valid_0's auc: 0.760154
[33]	valid_0's auc: 0.760154
[34]	valid_0's auc: 0.760154
[35]	valid_0's auc: 0.760154
[36]	valid_0's auc: 0.760154
[37]	valid_0's auc: 0.760154
[38]	valid_0's auc: 0.760154
[39]	valid_0's auc: 0.760154
[40]	valid_0's auc: 0.760154
[41]	valid_0's auc: 0.760154
[42]	valid_0's auc: 0.760154
[43]	valid_0's auc: 0.760154
[44]	valid_0's auc: 0.760154
[45]	valid_0's auc: 0.760154
[46]	valid_0's auc: 0.760154
[47]	valid_0's auc: 0.760154
[48]	valid_0's auc: 0.760154
[49]	valid_0's auc: 0.760154
[50]	valid_0's auc: 0.760154
[51]	valid_0's auc: 0.760154
[52]	valid_0's auc: 0.760154
[53]	valid_0's auc: 0.760154
[54]	valid_0's auc: 0.760154
[55]	valid_0's auc: 0.760154
[56]	valid_0's auc: 0.760154
[57]	valid_0's auc: 0.760154
[58]	valid_0's auc: 0.760154
[59]	valid_0's auc: 0.760154
[60]	valid_0's auc: 0.760154
[61]	valid_0's auc: 0.760154
[62]	valid_0's auc: 0.760154
[63]	valid_0's auc: 0.760154
[64]	valid_0's auc: 0.760154
[65]	valid_0's auc: 0.760154
[66]	valid_0's auc: 0.760154
[67]	valid_0's auc: 0.760154
[68]	valid_0's auc: 0.760154
[69]	valid_0's auc: 0.760154
[70]	valid_0's auc: 0.760154
[71]	valid_0's auc: 0.760154
[72]	valid_0's auc: 0.760154
[73]	valid_0's auc: 0.760154
[74]	valid_0's auc: 0.760154
[75]	valid_0's auc: 0.760154
[76]	valid_0's auc: 0.760154
[77]	valid_0's auc: 0.760154
[78]	valid_0's auc: 0.760154
[79]	valid_0's auc: 0.760154
[80]	valid_0's auc: 0.760154
[81]	valid_0's auc: 0.760154
[82]	valid_0's auc: 0.760154
[83]	valid_0's auc: 0.760154
[84]	valid_0's auc: 0.760154
[85]	valid_0's auc: 0.760154
[86]	valid_0's auc: 0.760154
[87]	valid_0's auc: 0.760154
[88]	valid_0's auc: 0.760154
[89]	valid_0's auc: 0.760154
[90]	valid_0's auc: 0.760154
[91]	valid_0's auc: 0.760154
[92]	valid_0's auc: 0.760154
[93]	valid_0's auc: 0.760154
[94]	valid_0's auc: 0.760154
[95]	valid_0's auc: 0.760154
[96]	valid_0's auc: 0.760243
[97]	valid_0's auc: 0.760243
[98]	valid_0's auc: 0.760243
[99]	valid_0's auc: 0.760243
[100]	valid_0's auc: 0.760243
[1]	valid_0's auc: 0.759786
[2]	valid_0's auc: 0.759786
[3]	valid_0's auc: 0.759786
[4]	valid_0's auc: 0.759786
[5]	valid_0's auc: 0.759786
[6]	valid_0's auc: 0.759786
[7]	valid_0's auc: 0.759786
[8]	valid_0's auc: 0.759786
[9]	valid_0's auc: 0.759786
[10]	valid_0's auc: 0.759786
[11]	valid_0's auc: 0.759786
[12]	valid_0's auc: 0.759786
[13]	valid_0's auc: 0.759786
[14]	valid_0's auc: 0.759786
[15]	valid_0's auc: 0.759786
[16]	valid_0's auc: 0.759786
[17]	valid_0's auc: 0.759786
[18]	valid_0's auc: 0.759786
[19]	valid_0's auc: 0.759786
[20]	valid_0's auc: 0.759786
[21]	valid_0's auc: 0.759786
[22]	valid_0's auc: 0.759786
[23]	valid_0's auc: 0.759786
[24]	valid_0's auc: 0.759786
[25]	valid_0's auc: 0.759786
[26]	valid_0's auc: 0.759786
[27]	valid_0's auc: 0.759786
[28]	valid_0's auc: 0.759786
[29]	valid_0's auc: 0.759786
[30]	valid_0's auc: 0.759786
[31]	valid_0's auc: 0.759786
[32]	valid_0's auc: 0.759786
[33]	valid_0's auc: 0.759786
[34]	valid_0's auc: 0.759786
[35]	valid_0's auc: 0.759786
[36]	valid_0's auc: 0.759786
[37]	valid_0's auc: 0.759786
[38]	valid_0's auc: 0.759786
[39]	valid_0's auc: 0.759786
[40]	valid_0's auc: 0.75979
[41]	valid_0's auc: 0.75979
[42]	valid_0's auc: 0.75979
[43]	valid_0's auc: 0.75979
[44]	valid_0's auc: 0.75979
[45]	valid_0's auc: 0.75979
[46]	valid_0's auc: 0.75979
[47]	valid_0's auc: 0.75979
[48]	valid_0's auc: 0.75979
[49]	valid_0's auc: 0.759833
[50]	valid_0's auc: 0.759833
[51]	valid_0's auc: 0.759833
[52]	valid_0's auc: 0.759833
[53]	valid_0's auc: 0.759833
[54]	valid_0's auc: 0.759939
[55]	valid_0's auc: 0.759939
[56]	valid_0's auc: 0.759939
[57]	valid_0's auc: 0.759939
[58]	valid_0's auc: 0.759939
[59]	valid_0's auc: 0.759939
[60]	valid_0's auc: 0.759939
[61]	valid_0's auc: 0.759939
[62]	valid_0's auc: 0.759939
[63]	valid_0's auc: 0.759939
[64]	valid_0's auc: 0.759939
[65]	valid_0's auc: 0.759939
[66]	valid_0's auc: 0.759939
[67]	valid_0's auc: 0.759939
[68]	valid_0's auc: 0.759939
[69]	valid_0's auc: 0.759939
[70]	valid_0's auc: 0.759939
[71]	valid_0's auc: 0.759939
[72]	valid_0's auc: 0.759939
[73]	valid_0's auc: 0.759939
[74]	valid_0's auc: 0.759939
[75]	valid_0's auc: 0.759939
[76]	valid_0's auc: 0.759939
[77]	valid_0's auc: 0.759939
[78]	valid_0's auc: 0.759939
[79]	valid_0's auc: 0.759939
[80]	valid_0's auc: 0.759939
[81]	valid_0's auc: 0.759939
[82]	valid_0's auc: 0.759939
[83]	valid_0's auc: 0.759939
[84]	valid_0's auc: 0.759939
[85]	valid_0's auc: 0.759939
[86]	valid_0's auc: 0.759939
[87]	valid_0's auc: 0.759939
[88]	valid_0's auc: 0.759939
[89]	valid_0's auc: 0.759939
[90]	valid_0's auc: 0.759939
[91]	valid_0's auc: 0.759939
[92]	valid_0's auc: 0.759939
[93]	valid_0's auc: 0.759939
[94]	valid_0's auc: 0.759939
[95]	valid_0's auc: 0.759939
[96]	valid_0's auc: 0.759939
[97]	valid_0's auc: 0.759939
[98]	valid_0's auc: 0.759939
[99]	valid_0's auc: 0.759939
[100]	valid_0's auc: 0.759939
[1]	valid_0's auc: 0.762215
[2]	valid_0's auc: 0.762215
[3]	valid_0's auc: 0.762215
[4]	valid_0's auc: 0.762215
[5]	valid_0's auc: 0.762215
[6]	valid_0's auc: 0.762215
[7]	valid_0's auc: 0.762215
[8]	valid_0's auc: 0.762215
[9]	valid_0's auc: 0.762215
[10]	valid_0's auc: 0.762215
[11]	valid_0's auc: 0.762215
[12]	valid_0's auc: 0.762215
[13]	valid_0's auc: 0.762215
[14]	valid_0's auc: 0.762215
[15]	valid_0's auc: 0.762215
[16]	valid_0's auc: 0.762215
[17]	valid_0's auc: 0.762215
[18]	valid_0's auc: 0.762215
[19]	valid_0's auc: 0.762215
[20]	valid_0's auc: 0.762215
[21]	valid_0's auc: 0.762215
[22]	valid_0's auc: 0.762215
[23]	valid_0's auc: 0.762215
[24]	valid_0's auc: 0.762215
[25]	valid_0's auc: 0.762215
[26]	valid_0's auc: 0.762215
[27]	valid_0's auc: 0.762215
[28]	valid_0's auc: 0.762215
[29]	valid_0's auc: 0.762215
[30]	valid_0's auc: 0.762215
[31]	valid_0's auc: 0.762215
[32]	valid_0's auc: 0.762215
[33]	valid_0's auc: 0.762215
[34]	valid_0's auc: 0.762215
[35]	valid_0's auc: 0.762215
[36]	valid_0's auc: 0.762215
[37]	valid_0's auc: 0.762215
[38]	valid_0's auc: 0.762215
[39]	valid_0's auc: 0.762215
[40]	valid_0's auc: 0.762215
[41]	valid_0's auc: 0.762215
[42]	valid_0's auc: 0.762215
[43]	valid_0's auc: 0.762215
[44]	valid_0's auc: 0.762215
[45]	valid_0's auc: 0.762215
[46]	valid_0's auc: 0.762215
[47]	valid_0's auc: 0.762215
[48]	valid_0's auc: 0.762215
[49]	valid_0's auc: 0.762215
[50]	valid_0's auc: 0.762215
[51]	valid_0's auc: 0.762215
[52]	valid_0's auc: 0.762512
[53]	valid_0's auc: 0.762512
[54]	valid_0's auc: 0.762512
[55]	valid_0's auc: 0.762512
[56]	valid_0's auc: 0.762512
[57]	valid_0's auc: 0.762512
[58]	valid_0's auc: 0.762512
[59]	valid_0's auc: 0.762512
[60]	valid_0's auc: 0.762512
[61]	valid_0's auc: 0.762512
[62]	valid_0's auc: 0.762512
[63]	valid_0's auc: 0.762512
[64]	valid_0's auc: 0.762512
[65]	valid_0's auc: 0.762512
[66]	valid_0's auc: 0.762512
[67]	valid_0's auc: 0.762512
[68]	valid_0's auc: 0.762512
[69]	valid_0's auc: 0.762512
[70]	valid_0's auc: 0.762512
[71]	valid_0's auc: 0.762512
[72]	valid_0's auc: 0.762512
[73]	valid_0's auc: 0.762512
[74]	valid_0's auc: 0.762512
[75]	valid_0's auc: 0.762512
[76]	valid_0's auc: 0.762512
[77]	valid_0's auc: 0.762512
[78]	valid_0's auc: 0.762512
[79]	valid_0's auc: 0.762512
[80]	valid_0's auc: 0.762562
[81]	valid_0's auc: 0.762562
[82]	valid_0's auc: 0.76264
[83]	valid_0's auc: 0.76264
[... verbose LightGBM training output truncated: 100 boosting rounds per cross-validation run; valid_0's AUC plateaued within the first few rounds of each run, with final values ranging from 0.7483 to 0.7626 ...]
[38]	valid_0's auc: 0.760477
[39]	valid_0's auc: 0.760477
[40]	valid_0's auc: 0.760477
[41]	valid_0's auc: 0.760477
[42]	valid_0's auc: 0.760477
[43]	valid_0's auc: 0.760477
[44]	valid_0's auc: 0.760477
[45]	valid_0's auc: 0.760477
[46]	valid_0's auc: 0.760477
[47]	valid_0's auc: 0.760477
[48]	valid_0's auc: 0.760477
[49]	valid_0's auc: 0.760477
[50]	valid_0's auc: 0.760477
[51]	valid_0's auc: 0.760477
[52]	valid_0's auc: 0.760477
[53]	valid_0's auc: 0.760477
[54]	valid_0's auc: 0.760477
[55]	valid_0's auc: 0.760477
[56]	valid_0's auc: 0.760477
[57]	valid_0's auc: 0.760477
[58]	valid_0's auc: 0.760477
[59]	valid_0's auc: 0.760477
[60]	valid_0's auc: 0.760477
[61]	valid_0's auc: 0.760477
[62]	valid_0's auc: 0.760477
[63]	valid_0's auc: 0.760477
[64]	valid_0's auc: 0.760477
[65]	valid_0's auc: 0.760477
[66]	valid_0's auc: 0.760477
[67]	valid_0's auc: 0.760477
[68]	valid_0's auc: 0.760477
[69]	valid_0's auc: 0.760477
[70]	valid_0's auc: 0.760477
[71]	valid_0's auc: 0.760477
[72]	valid_0's auc: 0.760477
[73]	valid_0's auc: 0.760477
[74]	valid_0's auc: 0.760477
[75]	valid_0's auc: 0.760477
[76]	valid_0's auc: 0.760477
[77]	valid_0's auc: 0.760477
[78]	valid_0's auc: 0.760477
[79]	valid_0's auc: 0.760477
[80]	valid_0's auc: 0.760477
[81]	valid_0's auc: 0.760477
[82]	valid_0's auc: 0.760477
[83]	valid_0's auc: 0.760477
[84]	valid_0's auc: 0.760477
[85]	valid_0's auc: 0.760477
[86]	valid_0's auc: 0.760477
[87]	valid_0's auc: 0.760477
[88]	valid_0's auc: 0.760477
[89]	valid_0's auc: 0.760477
[90]	valid_0's auc: 0.760477
[91]	valid_0's auc: 0.760477
[92]	valid_0's auc: 0.760477
[93]	valid_0's auc: 0.760477
[94]	valid_0's auc: 0.760477
[95]	valid_0's auc: 0.760477
[96]	valid_0's auc: 0.760477
[97]	valid_0's auc: 0.760477
[98]	valid_0's auc: 0.760477
[99]	valid_0's auc: 0.760477
[100]	valid_0's auc: 0.760477
[1]	valid_0's auc: 0.7495
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.7495
[3]	valid_0's auc: 0.7495
[4]	valid_0's auc: 0.7495
[5]	valid_0's auc: 0.7495
[6]	valid_0's auc: 0.751338
[7]	valid_0's auc: 0.751338
[8]	valid_0's auc: 0.749299
[9]	valid_0's auc: 0.750152
[10]	valid_0's auc: 0.750327
[11]	valid_0's auc: 0.753588
[12]	valid_0's auc: 0.751726
[13]	valid_0's auc: 0.753588
[14]	valid_0's auc: 0.75362
[15]	valid_0's auc: 0.752114
[16]	valid_0's auc: 0.75397
[17]	valid_0's auc: 0.752302
[18]	valid_0's auc: 0.75454
[19]	valid_0's auc: 0.754551
[20]	valid_0's auc: 0.755186
[21]	valid_0's auc: 0.755188
[22]	valid_0's auc: 0.756006
[23]	valid_0's auc: 0.75601
[24]	valid_0's auc: 0.75633
[25]	valid_0's auc: 0.756265
[26]	valid_0's auc: 0.756053
[27]	valid_0's auc: 0.756274
[28]	valid_0's auc: 0.75762
[29]	valid_0's auc: 0.756782
[30]	valid_0's auc: 0.757634
[31]	valid_0's auc: 0.757675
[32]	valid_0's auc: 0.758008
[33]	valid_0's auc: 0.756743
[34]	valid_0's auc: 0.756602
[35]	valid_0's auc: 0.757746
[36]	valid_0's auc: 0.757017
[37]	valid_0's auc: 0.757261
[38]	valid_0's auc: 0.758645
[39]	valid_0's auc: 0.758549
[40]	valid_0's auc: 0.758784
[41]	valid_0's auc: 0.758708
[42]	valid_0's auc: 0.758747
[43]	valid_0's auc: 0.758709
[44]	valid_0's auc: 0.758743
[45]	valid_0's auc: 0.759398
[46]	valid_0's auc: 0.759397
[47]	valid_0's auc: 0.759717
[48]	valid_0's auc: 0.759915
[49]	valid_0's auc: 0.759838
[50]	valid_0's auc: 0.759863
[51]	valid_0's auc: 0.76599
[52]	valid_0's auc: 0.766056
[53]	valid_0's auc: 0.766283
[54]	valid_0's auc: 0.766718
[55]	valid_0's auc: 0.766806
[56]	valid_0's auc: 0.766793
[57]	valid_0's auc: 0.766794
[58]	valid_0's auc: 0.767103
[59]	valid_0's auc: 0.76718
[60]	valid_0's auc: 0.767451
[61]	valid_0's auc: 0.767644
[62]	valid_0's auc: 0.76792
[63]	valid_0's auc: 0.768226
[64]	valid_0's auc: 0.768386
[65]	valid_0's auc: 0.768722
[66]	valid_0's auc: 0.769498
[67]	valid_0's auc: 0.769579
[68]	valid_0's auc: 0.769717
[69]	valid_0's auc: 0.769656
[70]	valid_0's auc: 0.769725
[71]	valid_0's auc: 0.770004
[72]	valid_0's auc: 0.770071
[73]	valid_0's auc: 0.770156
[74]	valid_0's auc: 0.770421
[75]	valid_0's auc: 0.770445
[76]	valid_0's auc: 0.770643
[77]	valid_0's auc: 0.770704
[78]	valid_0's auc: 0.770812
[79]	valid_0's auc: 0.771044
[80]	valid_0's auc: 0.771391
[81]	valid_0's auc: 0.771437
[82]	valid_0's auc: 0.771472
[83]	valid_0's auc: 0.771566
[84]	valid_0's auc: 0.771689
[85]	valid_0's auc: 0.771736
[86]	valid_0's auc: 0.771801
[87]	valid_0's auc: 0.772011
[88]	valid_0's auc: 0.772728
[89]	valid_0's auc: 0.7728
[90]	valid_0's auc: 0.772898
[91]	valid_0's auc: 0.772956
[92]	valid_0's auc: 0.773111
[93]	valid_0's auc: 0.773019
[94]	valid_0's auc: 0.773114
[95]	valid_0's auc: 0.773192
[96]	valid_0's auc: 0.773362
[97]	valid_0's auc: 0.77347
[98]	valid_0's auc: 0.773557
[99]	valid_0's auc: 0.773817
[100]	valid_0's auc: 0.773901
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.773901
[1]	valid_0's auc: 0.748453
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.749691
[3]	valid_0's auc: 0.751615
[4]	valid_0's auc: 0.751094
[5]	valid_0's auc: 0.752006
[6]	valid_0's auc: 0.751589
[7]	valid_0's auc: 0.756645
[8]	valid_0's auc: 0.756098
[9]	valid_0's auc: 0.756598
[10]	valid_0's auc: 0.756621
[11]	valid_0's auc: 0.756363
[12]	valid_0's auc: 0.756692
[13]	valid_0's auc: 0.756926
[14]	valid_0's auc: 0.757646
[15]	valid_0's auc: 0.75773
[16]	valid_0's auc: 0.761756
[17]	valid_0's auc: 0.763676
[18]	valid_0's auc: 0.763493
[19]	valid_0's auc: 0.765036
[20]	valid_0's auc: 0.765387
[21]	valid_0's auc: 0.765743
[22]	valid_0's auc: 0.76701
[23]	valid_0's auc: 0.768299
[24]	valid_0's auc: 0.768336
[25]	valid_0's auc: 0.768381
[26]	valid_0's auc: 0.768425
[27]	valid_0's auc: 0.768594
[28]	valid_0's auc: 0.768752
[29]	valid_0's auc: 0.76916
[30]	valid_0's auc: 0.769535
[31]	valid_0's auc: 0.769674
[32]	valid_0's auc: 0.76973
[33]	valid_0's auc: 0.76982
[34]	valid_0's auc: 0.769762
[35]	valid_0's auc: 0.769653
[36]	valid_0's auc: 0.769753
[37]	valid_0's auc: 0.769828
[38]	valid_0's auc: 0.769715
[39]	valid_0's auc: 0.769539
[40]	valid_0's auc: 0.769506
[41]	valid_0's auc: 0.769693
[42]	valid_0's auc: 0.769792
[43]	valid_0's auc: 0.769553
[44]	valid_0's auc: 0.769513
[45]	valid_0's auc: 0.769447
[46]	valid_0's auc: 0.770235
[47]	valid_0's auc: 0.770256
[48]	valid_0's auc: 0.770406
[49]	valid_0's auc: 0.770384
[50]	valid_0's auc: 0.770514
[51]	valid_0's auc: 0.770421
[52]	valid_0's auc: 0.770347
[53]	valid_0's auc: 0.770589
[54]	valid_0's auc: 0.770489
[55]	valid_0's auc: 0.770481
[56]	valid_0's auc: 0.770342
[57]	valid_0's auc: 0.770254
[58]	valid_0's auc: 0.770084
[59]	valid_0's auc: 0.770734
[60]	valid_0's auc: 0.770675
[61]	valid_0's auc: 0.77066
[62]	valid_0's auc: 0.770407
[63]	valid_0's auc: 0.770372
[64]	valid_0's auc: 0.770708
[65]	valid_0's auc: 0.770528
[66]	valid_0's auc: 0.770437
[67]	valid_0's auc: 0.770418
[68]	valid_0's auc: 0.77043
[69]	valid_0's auc: 0.770575
Early stopping, best iteration is:
[59]	valid_0's auc: 0.770734
[1]	valid_0's auc: 0.748867
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.762521
[3]	valid_0's auc: 0.764062
[4]	valid_0's auc: 0.763258
[5]	valid_0's auc: 0.762373
[6]	valid_0's auc: 0.764062
[7]	valid_0's auc: 0.763748
[8]	valid_0's auc: 0.763192
[9]	valid_0's auc: 0.762451
[10]	valid_0's auc: 0.762405
[11]	valid_0's auc: 0.762414
[12]	valid_0's auc: 0.762585
[13]	valid_0's auc: 0.764478
[14]	valid_0's auc: 0.763795
[15]	valid_0's auc: 0.763768
[16]	valid_0's auc: 0.765035
[17]	valid_0's auc: 0.764958
[18]	valid_0's auc: 0.764806
[19]	valid_0's auc: 0.76476
[20]	valid_0's auc: 0.764852
[21]	valid_0's auc: 0.765044
[22]	valid_0's auc: 0.765812
[23]	valid_0's auc: 0.766688
[24]	valid_0's auc: 0.766439
[25]	valid_0's auc: 0.766191
[26]	valid_0's auc: 0.765874
[27]	valid_0's auc: 0.766426
[28]	valid_0's auc: 0.766262
[29]	valid_0's auc: 0.767146
[30]	valid_0's auc: 0.767198
[31]	valid_0's auc: 0.767499
[32]	valid_0's auc: 0.767265
[33]	valid_0's auc: 0.767585
[34]	valid_0's auc: 0.767686
[35]	valid_0's auc: 0.767393
[36]	valid_0's auc: 0.76741
[37]	valid_0's auc: 0.767271
[38]	valid_0's auc: 0.767502
[39]	valid_0's auc: 0.767493
[40]	valid_0's auc: 0.768043
[41]	valid_0's auc: 0.768017
[42]	valid_0's auc: 0.767905
[43]	valid_0's auc: 0.768008
[44]	valid_0's auc: 0.767984
[45]	valid_0's auc: 0.76836
[46]	valid_0's auc: 0.768179
[47]	valid_0's auc: 0.768016
[48]	valid_0's auc: 0.768079
[49]	valid_0's auc: 0.768291
[50]	valid_0's auc: 0.768727
[51]	valid_0's auc: 0.768816
[52]	valid_0's auc: 0.768913
[53]	valid_0's auc: 0.769575
[54]	valid_0's auc: 0.769459
[55]	valid_0's auc: 0.769078
[56]	valid_0's auc: 0.769225
[57]	valid_0's auc: 0.769868
[58]	valid_0's auc: 0.769948
[59]	valid_0's auc: 0.770082
[60]	valid_0's auc: 0.770177
[61]	valid_0's auc: 0.770308
[62]	valid_0's auc: 0.770811
[63]	valid_0's auc: 0.771087
[64]	valid_0's auc: 0.771062
[65]	valid_0's auc: 0.771391
[66]	valid_0's auc: 0.771491
[67]	valid_0's auc: 0.771609
[68]	valid_0's auc: 0.771634
[69]	valid_0's auc: 0.771639
[70]	valid_0's auc: 0.77184
[71]	valid_0's auc: 0.772131
[72]	valid_0's auc: 0.772183
[73]	valid_0's auc: 0.772289
[74]	valid_0's auc: 0.772651
[75]	valid_0's auc: 0.772794
[76]	valid_0's auc: 0.772885
[77]	valid_0's auc: 0.77315
[78]	valid_0's auc: 0.773337
[79]	valid_0's auc: 0.773271
[80]	valid_0's auc: 0.77366
[81]	valid_0's auc: 0.773411
[82]	valid_0's auc: 0.773837
[83]	valid_0's auc: 0.773812
[84]	valid_0's auc: 0.774001
[85]	valid_0's auc: 0.774302
[86]	valid_0's auc: 0.774236
[87]	valid_0's auc: 0.774115
[88]	valid_0's auc: 0.77435
[89]	valid_0's auc: 0.774409
[90]	valid_0's auc: 0.774726
[91]	valid_0's auc: 0.774635
[92]	valid_0's auc: 0.774694
[93]	valid_0's auc: 0.774737
[94]	valid_0's auc: 0.774803
[95]	valid_0's auc: 0.775202
[96]	valid_0's auc: 0.775197
[97]	valid_0's auc: 0.775361
[98]	valid_0's auc: 0.775361
[99]	valid_0's auc: 0.775321
[100]	valid_0's auc: 0.775437
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.775437
[1]	valid_0's auc: 0.748286
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.748286
[3]	valid_0's auc: 0.749259
[4]	valid_0's auc: 0.749362
[5]	valid_0's auc: 0.748976
[6]	valid_0's auc: 0.74936
[7]	valid_0's auc: 0.749362
[8]	valid_0's auc: 0.749138
[9]	valid_0's auc: 0.752684
[10]	valid_0's auc: 0.752831
[11]	valid_0's auc: 0.75366
[12]	valid_0's auc: 0.75411
[13]	valid_0's auc: 0.761886
[14]	valid_0's auc: 0.762058
[15]	valid_0's auc: 0.761342
[16]	valid_0's auc: 0.762343
[17]	valid_0's auc: 0.763406
[18]	valid_0's auc: 0.763392
[19]	valid_0's auc: 0.763273
[20]	valid_0's auc: 0.76458
[21]	valid_0's auc: 0.763966
[22]	valid_0's auc: 0.764795
[23]	valid_0's auc: 0.765132
[24]	valid_0's auc: 0.765137
[25]	valid_0's auc: 0.765224
[26]	valid_0's auc: 0.765178
[27]	valid_0's auc: 0.765203
[28]	valid_0's auc: 0.765275
[29]	valid_0's auc: 0.76528
[30]	valid_0's auc: 0.765251
[31]	valid_0's auc: 0.766388
[32]	valid_0's auc: 0.76663
[33]	valid_0's auc: 0.766285
[34]	valid_0's auc: 0.766925
[35]	valid_0's auc: 0.766744
[36]	valid_0's auc: 0.766676
[37]	valid_0's auc: 0.767472
[38]	valid_0's auc: 0.76736
[39]	valid_0's auc: 0.767359
[40]	valid_0's auc: 0.76739
[41]	valid_0's auc: 0.767703
[42]	valid_0's auc: 0.767477
[43]	valid_0's auc: 0.767541
[44]	valid_0's auc: 0.767863
[45]	valid_0's auc: 0.767699
[46]	valid_0's auc: 0.767717
[47]	valid_0's auc: 0.76795
[48]	valid_0's auc: 0.767933
[49]	valid_0's auc: 0.767932
[50]	valid_0's auc: 0.768455
[51]	valid_0's auc: 0.768476
[52]	valid_0's auc: 0.76847
[53]	valid_0's auc: 0.76849
[54]	valid_0's auc: 0.768749
[55]	valid_0's auc: 0.768557
[56]	valid_0's auc: 0.769161
[57]	valid_0's auc: 0.769234
[58]	valid_0's auc: 0.769464
[59]	valid_0's auc: 0.769393
[60]	valid_0's auc: 0.769274
[61]	valid_0's auc: 0.769263
[62]	valid_0's auc: 0.769402
[63]	valid_0's auc: 0.769438
[64]	valid_0's auc: 0.76958
[65]	valid_0's auc: 0.769563
[66]	valid_0's auc: 0.769568
[67]	valid_0's auc: 0.769779
[68]	valid_0's auc: 0.769926
[69]	valid_0's auc: 0.769944
[70]	valid_0's auc: 0.770306
[71]	valid_0's auc: 0.770275
[72]	valid_0's auc: 0.770372
[73]	valid_0's auc: 0.771305
[74]	valid_0's auc: 0.771477
[75]	valid_0's auc: 0.771405
[76]	valid_0's auc: 0.771388
[77]	valid_0's auc: 0.771519
[78]	valid_0's auc: 0.771588
[79]	valid_0's auc: 0.772025
[80]	valid_0's auc: 0.772217
[81]	valid_0's auc: 0.772321
[82]	valid_0's auc: 0.772388
[83]	valid_0's auc: 0.772701
[84]	valid_0's auc: 0.772712
[85]	valid_0's auc: 0.772697
[86]	valid_0's auc: 0.773027
[87]	valid_0's auc: 0.773182
[88]	valid_0's auc: 0.773352
[89]	valid_0's auc: 0.773592
[90]	valid_0's auc: 0.773594
[91]	valid_0's auc: 0.773858
[92]	valid_0's auc: 0.773981
[93]	valid_0's auc: 0.774091
[94]	valid_0's auc: 0.77409
[95]	valid_0's auc: 0.774341
[96]	valid_0's auc: 0.774392
[97]	valid_0's auc: 0.774483
[98]	valid_0's auc: 0.774577
[99]	valid_0's auc: 0.774593
[100]	valid_0's auc: 0.774762
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.774762
[1]	valid_0's auc: 0.757043
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.757043
[3]	valid_0's auc: 0.757043
[4]	valid_0's auc: 0.757043
[5]	valid_0's auc: 0.757043
[6]	valid_0's auc: 0.757043
[7]	valid_0's auc: 0.757043
[8]	valid_0's auc: 0.757043
[9]	valid_0's auc: 0.757043
[10]	valid_0's auc: 0.757043
[11]	valid_0's auc: 0.758625
[12]	valid_0's auc: 0.759292
[13]	valid_0's auc: 0.760187
[14]	valid_0's auc: 0.760275
[15]	valid_0's auc: 0.760962
[16]	valid_0's auc: 0.761238
[17]	valid_0's auc: 0.761211
[18]	valid_0's auc: 0.7613
[19]	valid_0's auc: 0.761395
[20]	valid_0's auc: 0.761334
[21]	valid_0's auc: 0.761432
[22]	valid_0's auc: 0.761521
[23]	valid_0's auc: 0.761593
[24]	valid_0's auc: 0.761667
[25]	valid_0's auc: 0.761781
[26]	valid_0's auc: 0.762145
[27]	valid_0's auc: 0.762147
[28]	valid_0's auc: 0.762012
[29]	valid_0's auc: 0.762029
[30]	valid_0's auc: 0.76205
[31]	valid_0's auc: 0.7623
[32]	valid_0's auc: 0.762442
[33]	valid_0's auc: 0.762826
[34]	valid_0's auc: 0.762886
[35]	valid_0's auc: 0.762847
[36]	valid_0's auc: 0.762868
[37]	valid_0's auc: 0.762924
[38]	valid_0's auc: 0.763281
[39]	valid_0's auc: 0.76328
[40]	valid_0's auc: 0.763148
[41]	valid_0's auc: 0.763354
[42]	valid_0's auc: 0.764574
[43]	valid_0's auc: 0.764801
[44]	valid_0's auc: 0.764879
[45]	valid_0's auc: 0.765002
[46]	valid_0's auc: 0.76495
[47]	valid_0's auc: 0.765176
[48]	valid_0's auc: 0.765224
[49]	valid_0's auc: 0.76525
[50]	valid_0's auc: 0.765488
[51]	valid_0's auc: 0.765558
[52]	valid_0's auc: 0.765553
[53]	valid_0's auc: 0.765663
[54]	valid_0's auc: 0.765647
[55]	valid_0's auc: 0.766129
[56]	valid_0's auc: 0.766111
[57]	valid_0's auc: 0.765746
[58]	valid_0's auc: 0.76651
[59]	valid_0's auc: 0.766548
[60]	valid_0's auc: 0.766745
[61]	valid_0's auc: 0.771389
[62]	valid_0's auc: 0.771551
[63]	valid_0's auc: 0.771669
[64]	valid_0's auc: 0.771897
[65]	valid_0's auc: 0.771982
[66]	valid_0's auc: 0.772083
[67]	valid_0's auc: 0.772517
[68]	valid_0's auc: 0.772629
[69]	valid_0's auc: 0.773429
[70]	valid_0's auc: 0.773844
[71]	valid_0's auc: 0.773989
[72]	valid_0's auc: 0.774098
[73]	valid_0's auc: 0.775047
[74]	valid_0's auc: 0.775309
[75]	valid_0's auc: 0.775532
[76]	valid_0's auc: 0.775546
[77]	valid_0's auc: 0.77575
[78]	valid_0's auc: 0.775891
[79]	valid_0's auc: 0.776002
[80]	valid_0's auc: 0.776212
[81]	valid_0's auc: 0.776492
[82]	valid_0's auc: 0.776616
[83]	valid_0's auc: 0.776783
[84]	valid_0's auc: 0.776891
[85]	valid_0's auc: 0.776979
[86]	valid_0's auc: 0.777022
[87]	valid_0's auc: 0.777174
[88]	valid_0's auc: 0.777333
[89]	valid_0's auc: 0.77745
[90]	valid_0's auc: 0.77752
[91]	valid_0's auc: 0.777574
[92]	valid_0's auc: 0.777644
[93]	valid_0's auc: 0.777846
[94]	valid_0's auc: 0.777805
[95]	valid_0's auc: 0.777825
[96]	valid_0's auc: 0.777932
[97]	valid_0's auc: 0.778095
[98]	valid_0's auc: 0.778152
[99]	valid_0's auc: 0.778222
[100]	valid_0's auc: 0.778231
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.778231
[1]	valid_0's auc: 0.752298
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.759276
[3]	valid_0's auc: 0.759067
[4]	valid_0's auc: 0.759079
[5]	valid_0's auc: 0.759532
[6]	valid_0's auc: 0.759549
[7]	valid_0's auc: 0.761225
[8]	valid_0's auc: 0.761092
[9]	valid_0's auc: 0.761531
[10]	valid_0's auc: 0.761654
[11]	valid_0's auc: 0.76183
[12]	valid_0's auc: 0.761651
[13]	valid_0's auc: 0.761898
[14]	valid_0's auc: 0.763021
[15]	valid_0's auc: 0.763917
[16]	valid_0's auc: 0.764055
[17]	valid_0's auc: 0.764371
[18]	valid_0's auc: 0.764469
[19]	valid_0's auc: 0.764181
[20]	valid_0's auc: 0.764792
[21]	valid_0's auc: 0.764872
[22]	valid_0's auc: 0.766306
[23]	valid_0's auc: 0.766454
[24]	valid_0's auc: 0.766814
[25]	valid_0's auc: 0.770826
[26]	valid_0's auc: 0.77106
[27]	valid_0's auc: 0.771193
[28]	valid_0's auc: 0.771275
[29]	valid_0's auc: 0.771764
[30]	valid_0's auc: 0.772921
[31]	valid_0's auc: 0.772755
[32]	valid_0's auc: 0.772821
[33]	valid_0's auc: 0.773079
[34]	valid_0's auc: 0.774193
[35]	valid_0's auc: 0.774257
[36]	valid_0's auc: 0.774432
[37]	valid_0's auc: 0.774706
[38]	valid_0's auc: 0.775102
[39]	valid_0's auc: 0.775298
[40]	valid_0's auc: 0.775427
[41]	valid_0's auc: 0.77582
[42]	valid_0's auc: 0.775949
[43]	valid_0's auc: 0.776069
[44]	valid_0's auc: 0.776087
[45]	valid_0's auc: 0.776161
[46]	valid_0's auc: 0.776193
[47]	valid_0's auc: 0.776459
[48]	valid_0's auc: 0.776804
[49]	valid_0's auc: 0.777155
[50]	valid_0's auc: 0.777061
[51]	valid_0's auc: 0.777348
[52]	valid_0's auc: 0.777465
[53]	valid_0's auc: 0.777494
[54]	valid_0's auc: 0.777575
[55]	valid_0's auc: 0.777784
[56]	valid_0's auc: 0.777784
[57]	valid_0's auc: 0.777785
[58]	valid_0's auc: 0.77793
[59]	valid_0's auc: 0.778083
[60]	valid_0's auc: 0.778009
[61]	valid_0's auc: 0.778058
[62]	valid_0's auc: 0.778052
[63]	valid_0's auc: 0.777929
[64]	valid_0's auc: 0.778139
[65]	valid_0's auc: 0.778281
[66]	valid_0's auc: 0.778265
[67]	valid_0's auc: 0.778265
[68]	valid_0's auc: 0.77838
[69]	valid_0's auc: 0.778393
[70]	valid_0's auc: 0.778366
[71]	valid_0's auc: 0.778427
[72]	valid_0's auc: 0.778364
[73]	valid_0's auc: 0.778342
[74]	valid_0's auc: 0.778339
[75]	valid_0's auc: 0.778291
[76]	valid_0's auc: 0.778306
[77]	valid_0's auc: 0.778345
[78]	valid_0's auc: 0.778399
[79]	valid_0's auc: 0.778372
[80]	valid_0's auc: 0.778413
[81]	valid_0's auc: 0.778537
[82]	valid_0's auc: 0.77857
[83]	valid_0's auc: 0.778721
[84]	valid_0's auc: 0.778712
[85]	valid_0's auc: 0.778755
[86]	valid_0's auc: 0.778862
[87]	valid_0's auc: 0.779
[88]	valid_0's auc: 0.780026
[89]	valid_0's auc: 0.780002
[90]	valid_0's auc: 0.780006
[91]	valid_0's auc: 0.780188
[92]	valid_0's auc: 0.780227
[93]	valid_0's auc: 0.780227
[94]	valid_0's auc: 0.780421
[95]	valid_0's auc: 0.780511
[96]	valid_0's auc: 0.780642
[97]	valid_0's auc: 0.780721
[98]	valid_0's auc: 0.780743
[99]	valid_0's auc: 0.780898
[100]	valid_0's auc: 0.780924
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.780924
[1]	valid_0's auc: 0.754636
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.7683
[3]	valid_0's auc: 0.76755
[4]	valid_0's auc: 0.771673
[5]	valid_0's auc: 0.772091
[6]	valid_0's auc: 0.771553
[7]	valid_0's auc: 0.771367
[8]	valid_0's auc: 0.770636
[9]	valid_0's auc: 0.771707
[10]	valid_0's auc: 0.771595
[11]	valid_0's auc: 0.771764
[12]	valid_0's auc: 0.771648
[13]	valid_0's auc: 0.771187
[14]	valid_0's auc: 0.77186
[15]	valid_0's auc: 0.771543
Early stopping, best iteration is:
[5]	valid_0's auc: 0.772091
[1]	valid_0's auc: 0.757698
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.757698
[3]	valid_0's auc: 0.758663
[4]	valid_0's auc: 0.757968
[5]	valid_0's auc: 0.757959
[6]	valid_0's auc: 0.757982
[7]	valid_0's auc: 0.758325
[8]	valid_0's auc: 0.758427
[9]	valid_0's auc: 0.760477
[10]	valid_0's auc: 0.761208
[11]	valid_0's auc: 0.762052
[12]	valid_0's auc: 0.762095
[13]	valid_0's auc: 0.762074
[14]	valid_0's auc: 0.762127
[15]	valid_0's auc: 0.762133
[16]	valid_0's auc: 0.762254
[17]	valid_0's auc: 0.762284
[18]	valid_0's auc: 0.761951
[19]	valid_0's auc: 0.761953
[20]	valid_0's auc: 0.762233
[21]	valid_0's auc: 0.763617
[22]	valid_0's auc: 0.763017
[23]	valid_0's auc: 0.76413
[24]	valid_0's auc: 0.764133
[25]	valid_0's auc: 0.764182
[26]	valid_0's auc: 0.764466
[27]	valid_0's auc: 0.76447
[28]	valid_0's auc: 0.764439
[29]	valid_0's auc: 0.764552
[30]	valid_0's auc: 0.764529
[31]	valid_0's auc: 0.764645
[32]	valid_0's auc: 0.76461
[33]	valid_0's auc: 0.764749
[34]	valid_0's auc: 0.764821
[35]	valid_0's auc: 0.764874
[36]	valid_0's auc: 0.765379
[37]	valid_0's auc: 0.76569
[38]	valid_0's auc: 0.771086
[39]	valid_0's auc: 0.771052
[40]	valid_0's auc: 0.770973
[41]	valid_0's auc: 0.770923
[42]	valid_0's auc: 0.771175
[43]	valid_0's auc: 0.771435
[44]	valid_0's auc: 0.771566
[45]	valid_0's auc: 0.771553
[46]	valid_0's auc: 0.77223
[47]	valid_0's auc: 0.772217
[48]	valid_0's auc: 0.772403
[49]	valid_0's auc: 0.772724
[50]	valid_0's auc: 0.772727
[51]	valid_0's auc: 0.772736
[52]	valid_0's auc: 0.772851
[53]	valid_0's auc: 0.772891
[54]	valid_0's auc: 0.773151
[55]	valid_0's auc: 0.773219
[56]	valid_0's auc: 0.773409
[57]	valid_0's auc: 0.773384
[58]	valid_0's auc: 0.773475
[59]	valid_0's auc: 0.773402
[60]	valid_0's auc: 0.773757
[61]	valid_0's auc: 0.773803
[62]	valid_0's auc: 0.774054
[63]	valid_0's auc: 0.774023
[64]	valid_0's auc: 0.774089
[65]	valid_0's auc: 0.774476
[66]	valid_0's auc: 0.774687
[67]	valid_0's auc: 0.774747
[68]	valid_0's auc: 0.774837
[69]	valid_0's auc: 0.774838
[70]	valid_0's auc: 0.775095
[71]	valid_0's auc: 0.775076
[72]	valid_0's auc: 0.775126
[73]	valid_0's auc: 0.775658
[74]	valid_0's auc: 0.775383
[75]	valid_0's auc: 0.776725
[76]	valid_0's auc: 0.776896
[77]	valid_0's auc: 0.777285
[78]	valid_0's auc: 0.777833
[79]	valid_0's auc: 0.778284
[80]	valid_0's auc: 0.778558
[81]	valid_0's auc: 0.778694
[82]	valid_0's auc: 0.778815
[83]	valid_0's auc: 0.779367
[84]	valid_0's auc: 0.779458
[85]	valid_0's auc: 0.779447
[86]	valid_0's auc: 0.779756
[87]	valid_0's auc: 0.779789
[88]	valid_0's auc: 0.779914
[89]	valid_0's auc: 0.780055
[90]	valid_0's auc: 0.780157
[91]	valid_0's auc: 0.780275
[92]	valid_0's auc: 0.780355
[93]	valid_0's auc: 0.780553
[94]	valid_0's auc: 0.78061
[95]	valid_0's auc: 0.780827
[96]	valid_0's auc: 0.780831
[97]	valid_0's auc: 0.781015
[98]	valid_0's auc: 0.781166
[99]	valid_0's auc: 0.781131
[100]	valid_0's auc: 0.781194
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.781194
[1]	valid_0's auc: 0.760154
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.760161
[3]	valid_0's auc: 0.760248
[4]	valid_0's auc: 0.760916
[5]	valid_0's auc: 0.761004
[6]	valid_0's auc: 0.76127
[7]	valid_0's auc: 0.7613
[8]	valid_0's auc: 0.76127
[9]	valid_0's auc: 0.761278
[10]	valid_0's auc: 0.761278
[11]	valid_0's auc: 0.761268
[12]	valid_0's auc: 0.761437
[13]	valid_0's auc: 0.76341
[14]	valid_0's auc: 0.764528
[15]	valid_0's auc: 0.764469
[16]	valid_0's auc: 0.766352
[17]	valid_0's auc: 0.767053
[18]	valid_0's auc: 0.766859
[19]	valid_0's auc: 0.767212
[20]	valid_0's auc: 0.766872
[21]	valid_0's auc: 0.766866
[22]	valid_0's auc: 0.767423
[23]	valid_0's auc: 0.767323
[24]	valid_0's auc: 0.767413
[25]	valid_0's auc: 0.767656
[26]	valid_0's auc: 0.767654
[27]	valid_0's auc: 0.767613
[28]	valid_0's auc: 0.767717
[29]	valid_0's auc: 0.768226
[30]	valid_0's auc: 0.768494
[31]	valid_0's auc: 0.768577
[32]	valid_0's auc: 0.768808
[33]	valid_0's auc: 0.768591
[34]	valid_0's auc: 0.768697
[35]	valid_0's auc: 0.769098
[36]	valid_0's auc: 0.769009
[37]	valid_0's auc: 0.769042
[38]	valid_0's auc: 0.769575
[39]	valid_0's auc: 0.76962
[40]	valid_0's auc: 0.76958
[41]	valid_0's auc: 0.769666
[42]	valid_0's auc: 0.769771
[43]	valid_0's auc: 0.769773
[44]	valid_0's auc: 0.769946
[45]	valid_0's auc: 0.769864
[46]	valid_0's auc: 0.770008
[47]	valid_0's auc: 0.770105
[48]	valid_0's auc: 0.770358
[49]	valid_0's auc: 0.770544
[50]	valid_0's auc: 0.770754
[51]	valid_0's auc: 0.770966
[52]	valid_0's auc: 0.770991
[53]	valid_0's auc: 0.770996
[54]	valid_0's auc: 0.771023
[55]	valid_0's auc: 0.771135
[56]	valid_0's auc: 0.775167
[57]	valid_0's auc: 0.775182
[58]	valid_0's auc: 0.775229
[59]	valid_0's auc: 0.775415
[60]	valid_0's auc: 0.775413
[61]	valid_0's auc: 0.775489
[62]	valid_0's auc: 0.775545
[63]	valid_0's auc: 0.775602
[64]	valid_0's auc: 0.77607
[65]	valid_0's auc: 0.776277
[66]	valid_0's auc: 0.777002
[67]	valid_0's auc: 0.7773
[68]	valid_0's auc: 0.777466
[69]	valid_0's auc: 0.778277
[70]	valid_0's auc: 0.778481
[71]	valid_0's auc: 0.779101
[72]	valid_0's auc: 0.77928
[73]	valid_0's auc: 0.779498
[74]	valid_0's auc: 0.779962
[75]	valid_0's auc: 0.780241
[76]	valid_0's auc: 0.78028
[77]	valid_0's auc: 0.78044
[78]	valid_0's auc: 0.780489
[79]	valid_0's auc: 0.780655
[80]	valid_0's auc: 0.780801
[81]	valid_0's auc: 0.780863
[82]	valid_0's auc: 0.781056
[83]	valid_0's auc: 0.781219
[84]	valid_0's auc: 0.781239
[85]	valid_0's auc: 0.781329
[86]	valid_0's auc: 0.781488
[87]	valid_0's auc: 0.781692
[88]	valid_0's auc: 0.781652
[89]	valid_0's auc: 0.78181
[90]	valid_0's auc: 0.781885
[91]	valid_0's auc: 0.782033
[92]	valid_0's auc: 0.782059
[93]	valid_0's auc: 0.782156
[94]	valid_0's auc: 0.78235
[95]	valid_0's auc: 0.782382
[96]	valid_0's auc: 0.782505
[97]	valid_0's auc: 0.782803
[98]	valid_0's auc: 0.782928
[99]	valid_0's auc: 0.782958
[100]	valid_0's auc: 0.783127
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.783127
[1]	valid_0's auc: 0.759786
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.764657
[3]	valid_0's auc: 0.764758
[4]	valid_0's auc: 0.764375
[5]	valid_0's auc: 0.766985
[6]	valid_0's auc: 0.767135
[7]	valid_0's auc: 0.767228
[8]	valid_0's auc: 0.767524
[9]	valid_0's auc: 0.767575
[10]	valid_0's auc: 0.768459
[11]	valid_0's auc: 0.768472
[12]	valid_0's auc: 0.768308
[13]	valid_0's auc: 0.768742
[14]	valid_0's auc: 0.76912
[15]	valid_0's auc: 0.769523
[16]	valid_0's auc: 0.769527
[17]	valid_0's auc: 0.769747
[18]	valid_0's auc: 0.770015
[19]	valid_0's auc: 0.770021
[20]	valid_0's auc: 0.770216
[21]	valid_0's auc: 0.770394
[22]	valid_0's auc: 0.770491
[23]	valid_0's auc: 0.771372
[24]	valid_0's auc: 0.77146
[25]	valid_0's auc: 0.772478
[26]	valid_0's auc: 0.774667
[27]	valid_0's auc: 0.775156
[28]	valid_0's auc: 0.775394
[29]	valid_0's auc: 0.776522
[30]	valid_0's auc: 0.777059
[31]	valid_0's auc: 0.777769
[32]	valid_0's auc: 0.778398
[33]	valid_0's auc: 0.77855
[34]	valid_0's auc: 0.778852
[35]	valid_0's auc: 0.778991
[36]	valid_0's auc: 0.779339
[37]	valid_0's auc: 0.779584
[38]	valid_0's auc: 0.779981
[39]	valid_0's auc: 0.780291
[40]	valid_0's auc: 0.780597
[41]	valid_0's auc: 0.780625
[42]	valid_0's auc: 0.781065
[43]	valid_0's auc: 0.781104
[44]	valid_0's auc: 0.78122
[45]	valid_0's auc: 0.781282
[46]	valid_0's auc: 0.781359
[47]	valid_0's auc: 0.781428
[48]	valid_0's auc: 0.781493
[49]	valid_0's auc: 0.781566
[50]	valid_0's auc: 0.781573
[51]	valid_0's auc: 0.781657
[52]	valid_0's auc: 0.781654
[53]	valid_0's auc: 0.781991
[54]	valid_0's auc: 0.781977
[55]	valid_0's auc: 0.781937
[56]	valid_0's auc: 0.782062
[57]	valid_0's auc: 0.782204
[58]	valid_0's auc: 0.782183
[59]	valid_0's auc: 0.782284
[60]	valid_0's auc: 0.78236
[61]	valid_0's auc: 0.782299
[62]	valid_0's auc: 0.782241
[63]	valid_0's auc: 0.782255
[64]	valid_0's auc: 0.782392
[65]	valid_0's auc: 0.782954
[66]	valid_0's auc: 0.783
[67]	valid_0's auc: 0.783067
[68]	valid_0's auc: 0.783055
[69]	valid_0's auc: 0.783199
[70]	valid_0's auc: 0.784276
[71]	valid_0's auc: 0.784242
[72]	valid_0's auc: 0.784126
[73]	valid_0's auc: 0.784678
[74]	valid_0's auc: 0.784838
[75]	valid_0's auc: 0.784647
[76]	valid_0's auc: 0.78471
[77]	valid_0's auc: 0.785257
[78]	valid_0's auc: 0.785353
[79]	valid_0's auc: 0.785299
[80]	valid_0's auc: 0.785447
[81]	valid_0's auc: 0.785441
[82]	valid_0's auc: 0.785337
[83]	valid_0's auc: 0.785524
[84]	valid_0's auc: 0.785471
[85]	valid_0's auc: 0.785644
[86]	valid_0's auc: 0.785397
[87]	valid_0's auc: 0.785544
[88]	valid_0's auc: 0.785513
[89]	valid_0's auc: 0.785843
[90]	valid_0's auc: 0.785886
[91]	valid_0's auc: 0.786101
[92]	valid_0's auc: 0.786287
[93]	valid_0's auc: 0.786388
[94]	valid_0's auc: 0.786446
[95]	valid_0's auc: 0.786717
[96]	valid_0's auc: 0.786801
[97]	valid_0's auc: 0.786936
[98]	valid_0's auc: 0.786962
[99]	valid_0's auc: 0.787115
[100]	valid_0's auc: 0.787182
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.787182
[1]	valid_0's auc: 0.762215
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.772303
[3]	valid_0's auc: 0.772421
[4]	valid_0's auc: 0.776396
[5]	valid_0's auc: 0.776041
[6]	valid_0's auc: 0.775312
[7]	valid_0's auc: 0.775295
[8]	valid_0's auc: 0.774564
[9]	valid_0's auc: 0.775867
[10]	valid_0's auc: 0.776069
[11]	valid_0's auc: 0.77626
[12]	valid_0's auc: 0.775754
[13]	valid_0's auc: 0.775736
[14]	valid_0's auc: 0.775701
Early stopping, best iteration is:
[4]	valid_0's auc: 0.776396
[1]	valid_0's auc: 0.760477
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[100] condensed) …
Did not meet early stopping. Best iteration is:
[98]	valid_0's auc: 0.786566
[1]	valid_0's auc: 0.7495
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[100] condensed) …
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.773901
[1]	valid_0's auc: 0.748453
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[69] condensed) …
Early stopping, best iteration is:
[59]	valid_0's auc: 0.770734
[1]	valid_0's auc: 0.748867
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[100] condensed) …
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.775437
[1]	valid_0's auc: 0.748286
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[100] condensed) …
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.774762
[1]	valid_0's auc: 0.757043
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[100] condensed) …
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.778231
[1]	valid_0's auc: 0.752298
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[100] condensed) …
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.780924
[1]	valid_0's auc: 0.754636
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[15] condensed) …
Early stopping, best iteration is:
[5]	valid_0's auc: 0.772091
[1]	valid_0's auc: 0.757698
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[100] condensed) …
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.781194
[1]	valid_0's auc: 0.760154
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[100] condensed) …
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.783127
[1]	valid_0's auc: 0.759786
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[100] condensed) …
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.787182
[1]	valid_0's auc: 0.762215
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[14] condensed) …
Early stopping, best iteration is:
[4]	valid_0's auc: 0.776396
[1]	valid_0's auc: 0.760477
Training until validation scores don't improve for 10 rounds
… (rounds [2]–[100] condensed) …
Did not meet early stopping. Best iteration is:
[98]	valid_0's auc: 0.786566
[1]	valid_0's auc: 0.7495
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.7495
[3]	valid_0's auc: 0.7495
[4]	valid_0's auc: 0.751338
[5]	valid_0's auc: 0.750327
[6]	valid_0's auc: 0.753589
[7]	valid_0's auc: 0.753588
[8]	valid_0's auc: 0.752126
[9]	valid_0's auc: 0.75398
[10]	valid_0's auc: 0.75454
[11]	valid_0's auc: 0.756811
[12]	valid_0's auc: 0.756353
[13]	valid_0's auc: 0.756115
[14]	valid_0's auc: 0.757521
[15]	valid_0's auc: 0.757711
[16]	valid_0's auc: 0.756782
[17]	valid_0's auc: 0.7568
[18]	valid_0's auc: 0.757043
[19]	valid_0's auc: 0.757027
[20]	valid_0's auc: 0.758293
[21]	valid_0's auc: 0.758578
[22]	valid_0's auc: 0.759191
[23]	valid_0's auc: 0.759381
[24]	valid_0's auc: 0.759352
[25]	valid_0's auc: 0.759822
[26]	valid_0's auc: 0.765487
[27]	valid_0's auc: 0.765832
[28]	valid_0's auc: 0.765963
[29]	valid_0's auc: 0.766606
[30]	valid_0's auc: 0.76669
[31]	valid_0's auc: 0.767097
[32]	valid_0's auc: 0.767415
[33]	valid_0's auc: 0.768787
[34]	valid_0's auc: 0.769275
[35]	valid_0's auc: 0.769565
[36]	valid_0's auc: 0.769868
[37]	valid_0's auc: 0.770062
[38]	valid_0's auc: 0.770654
[39]	valid_0's auc: 0.770632
[40]	valid_0's auc: 0.770959
[41]	valid_0's auc: 0.771508
[42]	valid_0's auc: 0.771749
[43]	valid_0's auc: 0.771902
[44]	valid_0's auc: 0.772103
[45]	valid_0's auc: 0.772405
[46]	valid_0's auc: 0.772598
[47]	valid_0's auc: 0.773215
[48]	valid_0's auc: 0.772942
[49]	valid_0's auc: 0.773625
[50]	valid_0's auc: 0.773889
[51]	valid_0's auc: 0.774058
[52]	valid_0's auc: 0.774374
[53]	valid_0's auc: 0.774491
[54]	valid_0's auc: 0.774528
[55]	valid_0's auc: 0.774664
[56]	valid_0's auc: 0.774743
[57]	valid_0's auc: 0.77485
[58]	valid_0's auc: 0.774814
[59]	valid_0's auc: 0.775169
[60]	valid_0's auc: 0.775318
[61]	valid_0's auc: 0.775335
[62]	valid_0's auc: 0.775345
[63]	valid_0's auc: 0.775576
[64]	valid_0's auc: 0.775475
[65]	valid_0's auc: 0.775716
[66]	valid_0's auc: 0.776086
[67]	valid_0's auc: 0.776929
[68]	valid_0's auc: 0.777137
[69]	valid_0's auc: 0.777369
[70]	valid_0's auc: 0.777667
[71]	valid_0's auc: 0.777675
[72]	valid_0's auc: 0.777853
[73]	valid_0's auc: 0.778052
[74]	valid_0's auc: 0.778135
[75]	valid_0's auc: 0.778392
[76]	valid_0's auc: 0.778778
[77]	valid_0's auc: 0.779009
[78]	valid_0's auc: 0.779211
[79]	valid_0's auc: 0.779843
[80]	valid_0's auc: 0.780246
[81]	valid_0's auc: 0.780348
[82]	valid_0's auc: 0.780768
[83]	valid_0's auc: 0.780885
[84]	valid_0's auc: 0.781244
[85]	valid_0's auc: 0.781336
[86]	valid_0's auc: 0.781627
[87]	valid_0's auc: 0.781866
[88]	valid_0's auc: 0.78224
[89]	valid_0's auc: 0.782444
[90]	valid_0's auc: 0.782611
[91]	valid_0's auc: 0.782799
[92]	valid_0's auc: 0.782907
[93]	valid_0's auc: 0.783534
[94]	valid_0's auc: 0.783718
[95]	valid_0's auc: 0.783851
[96]	valid_0's auc: 0.783926
[97]	valid_0's auc: 0.784038
[98]	valid_0's auc: 0.784118
[99]	valid_0's auc: 0.784346
[100]	valid_0's auc: 0.784431
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.784431
[1]	valid_0's auc: 0.748453
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.750923
[3]	valid_0's auc: 0.75159
[4]	valid_0's auc: 0.752015
[5]	valid_0's auc: 0.756742
[6]	valid_0's auc: 0.756571
[7]	valid_0's auc: 0.756802
[8]	valid_0's auc: 0.763296
[9]	valid_0's auc: 0.764728
[10]	valid_0's auc: 0.764674
[11]	valid_0's auc: 0.766835
[12]	valid_0's auc: 0.767924
[13]	valid_0's auc: 0.768343
[14]	valid_0's auc: 0.768391
[15]	valid_0's auc: 0.768883
[16]	valid_0's auc: 0.768783
[17]	valid_0's auc: 0.76873
[18]	valid_0's auc: 0.769956
[19]	valid_0's auc: 0.769605
[20]	valid_0's auc: 0.769301
[21]	valid_0's auc: 0.76932
[22]	valid_0's auc: 0.769179
[23]	valid_0's auc: 0.769169
[24]	valid_0's auc: 0.769532
[25]	valid_0's auc: 0.770316
[26]	valid_0's auc: 0.77018
[27]	valid_0's auc: 0.770184
[28]	valid_0's auc: 0.770105
[29]	valid_0's auc: 0.770008
[30]	valid_0's auc: 0.770257
[31]	valid_0's auc: 0.770322
[32]	valid_0's auc: 0.77028
[33]	valid_0's auc: 0.770151
[34]	valid_0's auc: 0.769933
[35]	valid_0's auc: 0.770547
[36]	valid_0's auc: 0.770253
[37]	valid_0's auc: 0.77016
[38]	valid_0's auc: 0.77015
[39]	valid_0's auc: 0.770165
[40]	valid_0's auc: 0.771132
[41]	valid_0's auc: 0.771074
[42]	valid_0's auc: 0.771942
[43]	valid_0's auc: 0.77229
[44]	valid_0's auc: 0.772747
[45]	valid_0's auc: 0.77291
[46]	valid_0's auc: 0.772846
[47]	valid_0's auc: 0.773313
[48]	valid_0's auc: 0.773607
[49]	valid_0's auc: 0.773905
[50]	valid_0's auc: 0.774035
[51]	valid_0's auc: 0.774545
[52]	valid_0's auc: 0.774976
[53]	valid_0's auc: 0.775024
[54]	valid_0's auc: 0.775006
[55]	valid_0's auc: 0.775036
[56]	valid_0's auc: 0.775146
[57]	valid_0's auc: 0.775301
[58]	valid_0's auc: 0.775343
[59]	valid_0's auc: 0.775362
[60]	valid_0's auc: 0.775339
[61]	valid_0's auc: 0.775374
[62]	valid_0's auc: 0.77556
[63]	valid_0's auc: 0.775483
[64]	valid_0's auc: 0.775588
[65]	valid_0's auc: 0.775736
[66]	valid_0's auc: 0.775864
[67]	valid_0's auc: 0.775917
[68]	valid_0's auc: 0.776221
[69]	valid_0's auc: 0.77633
[70]	valid_0's auc: 0.776398
[71]	valid_0's auc: 0.776545
[72]	valid_0's auc: 0.776576
[73]	valid_0's auc: 0.776699
[74]	valid_0's auc: 0.776906
[75]	valid_0's auc: 0.777059
[76]	valid_0's auc: 0.777275
[77]	valid_0's auc: 0.777435
[78]	valid_0's auc: 0.777753
[79]	valid_0's auc: 0.778025
[80]	valid_0's auc: 0.778391
[81]	valid_0's auc: 0.778697
[82]	valid_0's auc: 0.778793
[83]	valid_0's auc: 0.779371
[84]	valid_0's auc: 0.779458
[85]	valid_0's auc: 0.779461
[86]	valid_0's auc: 0.779618
[87]	valid_0's auc: 0.779756
[88]	valid_0's auc: 0.779823
[89]	valid_0's auc: 0.780088
[90]	valid_0's auc: 0.780299
[91]	valid_0's auc: 0.780479
[92]	valid_0's auc: 0.780664
[93]	valid_0's auc: 0.780709
[94]	valid_0's auc: 0.780866
[95]	valid_0's auc: 0.781006
[96]	valid_0's auc: 0.781236
[97]	valid_0's auc: 0.782116
[98]	valid_0's auc: 0.782326
[99]	valid_0's auc: 0.782534
[100]	valid_0's auc: 0.782663
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.782663
[1]	valid_0's auc: 0.748867
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.764626
[3]	valid_0's auc: 0.764062
[4]	valid_0's auc: 0.763174
[5]	valid_0's auc: 0.762373
[6]	valid_0's auc: 0.762181
[7]	valid_0's auc: 0.764499
[8]	valid_0's auc: 0.765714
[9]	valid_0's auc: 0.764286
[10]	valid_0's auc: 0.764928
[11]	valid_0's auc: 0.766917
[12]	valid_0's auc: 0.766881
[13]	valid_0's auc: 0.766906
[14]	valid_0's auc: 0.766283
[15]	valid_0's auc: 0.7667
[16]	valid_0's auc: 0.766629
[17]	valid_0's auc: 0.76851
[18]	valid_0's auc: 0.767944
[19]	valid_0's auc: 0.767824
[20]	valid_0's auc: 0.767996
[21]	valid_0's auc: 0.768007
[22]	valid_0's auc: 0.768952
[23]	valid_0's auc: 0.76905
[24]	valid_0's auc: 0.768594
[25]	valid_0's auc: 0.768646
[26]	valid_0's auc: 0.76866
[27]	valid_0's auc: 0.769354
[28]	valid_0's auc: 0.769503
[29]	valid_0's auc: 0.769825
[30]	valid_0's auc: 0.770078
[31]	valid_0's auc: 0.769801
[32]	valid_0's auc: 0.77048
[33]	valid_0's auc: 0.770717
[34]	valid_0's auc: 0.771285
[35]	valid_0's auc: 0.771389
[36]	valid_0's auc: 0.77182
[37]	valid_0's auc: 0.77203
[38]	valid_0's auc: 0.7722
[39]	valid_0's auc: 0.773014
[40]	valid_0's auc: 0.773625
[41]	valid_0's auc: 0.773507
[42]	valid_0's auc: 0.773607
[43]	valid_0's auc: 0.773877
[44]	valid_0's auc: 0.774023
[45]	valid_0's auc: 0.774382
[46]	valid_0's auc: 0.77457
[47]	valid_0's auc: 0.774593
[48]	valid_0's auc: 0.774779
[49]	valid_0's auc: 0.774983
[50]	valid_0's auc: 0.775365
[51]	valid_0's auc: 0.775569
[52]	valid_0's auc: 0.775595
[53]	valid_0's auc: 0.775641
[54]	valid_0's auc: 0.775844
[55]	valid_0's auc: 0.776253
[56]	valid_0's auc: 0.776337
[57]	valid_0's auc: 0.776337
[58]	valid_0's auc: 0.776538
[59]	valid_0's auc: 0.776883
[60]	valid_0's auc: 0.776977
[61]	valid_0's auc: 0.777001
[62]	valid_0's auc: 0.777229
[63]	valid_0's auc: 0.777058
[64]	valid_0's auc: 0.777297
[65]	valid_0's auc: 0.777596
[66]	valid_0's auc: 0.777646
[67]	valid_0's auc: 0.777874
[68]	valid_0's auc: 0.778043
[69]	valid_0's auc: 0.778215
[70]	valid_0's auc: 0.778515
[71]	valid_0's auc: 0.778737
[72]	valid_0's auc: 0.778903
[73]	valid_0's auc: 0.778908
[74]	valid_0's auc: 0.779253
[75]	valid_0's auc: 0.779351
[76]	valid_0's auc: 0.779524
[77]	valid_0's auc: 0.779789
[78]	valid_0's auc: 0.77991
[79]	valid_0's auc: 0.780069
[80]	valid_0's auc: 0.780281
[81]	valid_0's auc: 0.780374
[82]	valid_0's auc: 0.780488
[83]	valid_0's auc: 0.78076
[84]	valid_0's auc: 0.780907
[85]	valid_0's auc: 0.780988
[86]	valid_0's auc: 0.781132
[87]	valid_0's auc: 0.781348
[88]	valid_0's auc: 0.781664
[89]	valid_0's auc: 0.781894
[90]	valid_0's auc: 0.781961
[91]	valid_0's auc: 0.782322
[92]	valid_0's auc: 0.782617
[93]	valid_0's auc: 0.782803
[94]	valid_0's auc: 0.783198
[95]	valid_0's auc: 0.783487
[96]	valid_0's auc: 0.783581
[97]	valid_0's auc: 0.783865
[98]	valid_0's auc: 0.784064
[99]	valid_0's auc: 0.784232
[100]	valid_0's auc: 0.784314
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.784314
[1]	valid_0's auc: 0.748286
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.748643
[3]	valid_0's auc: 0.748974
[4]	valid_0's auc: 0.749498
[5]	valid_0's auc: 0.7492
[6]	valid_0's auc: 0.752591
[7]	valid_0's auc: 0.762035
[8]	valid_0's auc: 0.762144
[9]	valid_0's auc: 0.763035
[10]	valid_0's auc: 0.763178
[11]	valid_0's auc: 0.765609
[12]	valid_0's auc: 0.76527
[13]	valid_0's auc: 0.765278
[14]	valid_0's auc: 0.765335
[15]	valid_0's auc: 0.76555
[16]	valid_0's auc: 0.76619
[17]	valid_0's auc: 0.766521
[18]	valid_0's auc: 0.76665
[19]	valid_0's auc: 0.767155
[20]	valid_0's auc: 0.767364
[21]	valid_0's auc: 0.767367
[22]	valid_0's auc: 0.767232
[23]	valid_0's auc: 0.767634
[24]	valid_0's auc: 0.767756
[25]	valid_0's auc: 0.767714
[26]	valid_0's auc: 0.767938
[27]	valid_0's auc: 0.767937
[28]	valid_0's auc: 0.767867
[29]	valid_0's auc: 0.768974
[30]	valid_0's auc: 0.769001
[31]	valid_0's auc: 0.769098
[32]	valid_0's auc: 0.769206
[33]	valid_0's auc: 0.769195
[34]	valid_0's auc: 0.770505
[35]	valid_0's auc: 0.770678
[36]	valid_0's auc: 0.770482
[37]	valid_0's auc: 0.770587
[38]	valid_0's auc: 0.771362
[39]	valid_0's auc: 0.771987
[40]	valid_0's auc: 0.77206
[41]	valid_0's auc: 0.772023
[42]	valid_0's auc: 0.772455
[43]	valid_0's auc: 0.772861
[44]	valid_0's auc: 0.773021
[45]	valid_0's auc: 0.773151
[46]	valid_0's auc: 0.773623
[47]	valid_0's auc: 0.773685
[48]	valid_0's auc: 0.774256
[49]	valid_0's auc: 0.77471
[50]	valid_0's auc: 0.775044
[51]	valid_0's auc: 0.77515
[52]	valid_0's auc: 0.775186
[53]	valid_0's auc: 0.775379
[54]	valid_0's auc: 0.775459
[55]	valid_0's auc: 0.775584
[56]	valid_0's auc: 0.775594
[57]	valid_0's auc: 0.776175
[58]	valid_0's auc: 0.776288
[59]	valid_0's auc: 0.776372
[60]	valid_0's auc: 0.77644
[61]	valid_0's auc: 0.77689
[62]	valid_0's auc: 0.777025
[63]	valid_0's auc: 0.776944
[64]	valid_0's auc: 0.777379
[65]	valid_0's auc: 0.777557
[66]	valid_0's auc: 0.77763
[67]	valid_0's auc: 0.778218
[68]	valid_0's auc: 0.778263
[69]	valid_0's auc: 0.778608
[70]	valid_0's auc: 0.778697
[71]	valid_0's auc: 0.778846
[72]	valid_0's auc: 0.779042
[73]	valid_0's auc: 0.779011
[74]	valid_0's auc: 0.779149
[75]	valid_0's auc: 0.779399
[76]	valid_0's auc: 0.779467
[77]	valid_0's auc: 0.780036
[78]	valid_0's auc: 0.780164
[79]	valid_0's auc: 0.780311
[80]	valid_0's auc: 0.780564
[81]	valid_0's auc: 0.780805
[82]	valid_0's auc: 0.780815
[83]	valid_0's auc: 0.781148
[84]	valid_0's auc: 0.781258
[85]	valid_0's auc: 0.781405
[86]	valid_0's auc: 0.781743
[87]	valid_0's auc: 0.782363
[88]	valid_0's auc: 0.782605
[89]	valid_0's auc: 0.783045
[90]	valid_0's auc: 0.783206
[91]	valid_0's auc: 0.783568
[92]	valid_0's auc: 0.784546
[93]	valid_0's auc: 0.784735
[94]	valid_0's auc: 0.784989
[95]	valid_0's auc: 0.78531
[96]	valid_0's auc: 0.78556
[97]	valid_0's auc: 0.785783
[98]	valid_0's auc: 0.785924
[99]	valid_0's auc: 0.786128
[100]	valid_0's auc: 0.786199
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.786199
[1]	valid_0's auc: 0.757043
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.757043
[3]	valid_0's auc: 0.757043
[4]	valid_0's auc: 0.757043
[5]	valid_0's auc: 0.757043
[6]	valid_0's auc: 0.758795
[7]	valid_0's auc: 0.760473
[8]	valid_0's auc: 0.760857
[9]	valid_0's auc: 0.760818
[10]	valid_0's auc: 0.761516
[11]	valid_0's auc: 0.761615
[12]	valid_0's auc: 0.761778
[13]	valid_0's auc: 0.761776
[14]	valid_0's auc: 0.761917
[15]	valid_0's auc: 0.762317
[16]	valid_0's auc: 0.762692
[17]	valid_0's auc: 0.763123
[18]	valid_0's auc: 0.763131
[19]	valid_0's auc: 0.763036
[20]	valid_0's auc: 0.764345
[21]	valid_0's auc: 0.764762
[22]	valid_0's auc: 0.765017
[23]	valid_0's auc: 0.764949
[24]	valid_0's auc: 0.765362
[25]	valid_0's auc: 0.765393
[26]	valid_0's auc: 0.765275
[27]	valid_0's auc: 0.765369
[28]	valid_0's auc: 0.765993
[29]	valid_0's auc: 0.766367
[30]	valid_0's auc: 0.76649
[31]	valid_0's auc: 0.771829
[32]	valid_0's auc: 0.772173
[33]	valid_0's auc: 0.772607
[34]	valid_0's auc: 0.773638
[35]	valid_0's auc: 0.774082
[36]	valid_0's auc: 0.775134
[37]	valid_0's auc: 0.775495
[38]	valid_0's auc: 0.775634
[39]	valid_0's auc: 0.776018
[40]	valid_0's auc: 0.776233
[41]	valid_0's auc: 0.776621
[42]	valid_0's auc: 0.776757
[43]	valid_0's auc: 0.777105
[44]	valid_0's auc: 0.777099
[45]	valid_0's auc: 0.777408
[46]	valid_0's auc: 0.777497
[47]	valid_0's auc: 0.777592
[48]	valid_0's auc: 0.777721
[49]	valid_0's auc: 0.777873
[50]	valid_0's auc: 0.778025
[51]	valid_0's auc: 0.778302
[52]	valid_0's auc: 0.778468
[53]	valid_0's auc: 0.778545
[54]	valid_0's auc: 0.77858
[55]	valid_0's auc: 0.778557
[56]	valid_0's auc: 0.778648
[57]	valid_0's auc: 0.778994
[58]	valid_0's auc: 0.779233
[59]	valid_0's auc: 0.779359
[60]	valid_0's auc: 0.779403
[61]	valid_0's auc: 0.779547
[62]	valid_0's auc: 0.779773
[63]	valid_0's auc: 0.779932
[64]	valid_0's auc: 0.780137
[65]	valid_0's auc: 0.78018
[66]	valid_0's auc: 0.780524
[67]	valid_0's auc: 0.78058
[68]	valid_0's auc: 0.78153
[69]	valid_0's auc: 0.782383
[70]	valid_0's auc: 0.782668
[71]	valid_0's auc: 0.782834
[72]	valid_0's auc: 0.78304
[73]	valid_0's auc: 0.783262
[74]	valid_0's auc: 0.783359
[75]	valid_0's auc: 0.783643
[76]	valid_0's auc: 0.7839
[77]	valid_0's auc: 0.784005
[78]	valid_0's auc: 0.784293
[79]	valid_0's auc: 0.784603
[80]	valid_0's auc: 0.784668
[81]	valid_0's auc: 0.784854
[82]	valid_0's auc: 0.785038
[83]	valid_0's auc: 0.785206
[84]	valid_0's auc: 0.785577
[85]	valid_0's auc: 0.786132
[86]	valid_0's auc: 0.786435
[87]	valid_0's auc: 0.786708
[88]	valid_0's auc: 0.786935
[89]	valid_0's auc: 0.787197
[90]	valid_0's auc: 0.787367
[91]	valid_0's auc: 0.787432
[92]	valid_0's auc: 0.787705
[93]	valid_0's auc: 0.787809
[94]	valid_0's auc: 0.788053
[95]	valid_0's auc: 0.788254
[96]	valid_0's auc: 0.788411
[97]	valid_0's auc: 0.78849
[98]	valid_0's auc: 0.788737
[99]	valid_0's auc: 0.788979
[100]	valid_0's auc: 0.789034
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.789034
[1]	valid_0's auc: 0.752298
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.759276
[3]	valid_0's auc: 0.759067
[4]	valid_0's auc: 0.760567
[5]	valid_0's auc: 0.759938
[6]	valid_0's auc: 0.762342
[7]	valid_0's auc: 0.762706
[8]	valid_0's auc: 0.763999
[9]	valid_0's auc: 0.764612
[10]	valid_0's auc: 0.764776
[11]	valid_0's auc: 0.765106
[12]	valid_0's auc: 0.766595
[13]	valid_0's auc: 0.771201
[14]	valid_0's auc: 0.771923
[15]	valid_0's auc: 0.772437
[16]	valid_0's auc: 0.772921
[17]	valid_0's auc: 0.77431
[18]	valid_0's auc: 0.774485
[19]	valid_0's auc: 0.774641
[20]	valid_0's auc: 0.775064
[21]	valid_0's auc: 0.775545
[22]	valid_0's auc: 0.77611
[23]	valid_0's auc: 0.776537
[24]	valid_0's auc: 0.776902
[25]	valid_0's auc: 0.777254
[26]	valid_0's auc: 0.777394
[27]	valid_0's auc: 0.777442
[28]	valid_0's auc: 0.777392
[29]	valid_0's auc: 0.77758
[30]	valid_0's auc: 0.777383
[31]	valid_0's auc: 0.777794
[32]	valid_0's auc: 0.777976
[33]	valid_0's auc: 0.778251
[34]	valid_0's auc: 0.777731
[35]	valid_0's auc: 0.77778
[36]	valid_0's auc: 0.777961
[37]	valid_0's auc: 0.778129
[38]	valid_0's auc: 0.778102
[39]	valid_0's auc: 0.778245
[40]	valid_0's auc: 0.778197
[41]	valid_0's auc: 0.778061
[42]	valid_0's auc: 0.778433
[43]	valid_0's auc: 0.778614
[44]	valid_0's auc: 0.77974
[45]	valid_0's auc: 0.779764
[46]	valid_0's auc: 0.779987
[47]	valid_0's auc: 0.780615
[48]	valid_0's auc: 0.780424
[49]	valid_0's auc: 0.780608
[50]	valid_0's auc: 0.780779
[51]	valid_0's auc: 0.780847
[52]	valid_0's auc: 0.78214
[53]	valid_0's auc: 0.78229
[54]	valid_0's auc: 0.782384
[55]	valid_0's auc: 0.782532
[56]	valid_0's auc: 0.782402
[57]	valid_0's auc: 0.782763
[58]	valid_0's auc: 0.782936
[59]	valid_0's auc: 0.783544
[60]	valid_0's auc: 0.783772
[61]	valid_0's auc: 0.783937
[62]	valid_0's auc: 0.78407
[63]	valid_0's auc: 0.784173
[64]	valid_0's auc: 0.784303
[65]	valid_0's auc: 0.784477
[66]	valid_0's auc: 0.784639
[67]	valid_0's auc: 0.784944
[68]	valid_0's auc: 0.78509
[69]	valid_0's auc: 0.785387
[70]	valid_0's auc: 0.785438
[71]	valid_0's auc: 0.785547
[72]	valid_0's auc: 0.785585
[73]	valid_0's auc: 0.785797
[74]	valid_0's auc: 0.785841
[75]	valid_0's auc: 0.786054
[76]	valid_0's auc: 0.786276
[77]	valid_0's auc: 0.78637
[78]	valid_0's auc: 0.786496
[79]	valid_0's auc: 0.786457
[80]	valid_0's auc: 0.786627
[81]	valid_0's auc: 0.786597
[82]	valid_0's auc: 0.78682
[83]	valid_0's auc: 0.786838
[84]	valid_0's auc: 0.786967
[85]	valid_0's auc: 0.787209
[86]	valid_0's auc: 0.787249
[87]	valid_0's auc: 0.78735
[88]	valid_0's auc: 0.787413
[89]	valid_0's auc: 0.787573
[90]	valid_0's auc: 0.787559
[91]	valid_0's auc: 0.787752
[92]	valid_0's auc: 0.787807
[93]	valid_0's auc: 0.788006
[94]	valid_0's auc: 0.788134
[95]	valid_0's auc: 0.788528
[96]	valid_0's auc: 0.788524
[97]	valid_0's auc: 0.788586
[98]	valid_0's auc: 0.788808
[99]	valid_0's auc: 0.789041
[100]	valid_0's auc: 0.789128
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.789128
[1]	valid_0's auc: 0.754636
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.768299
[3]	valid_0's auc: 0.771575
[4]	valid_0's auc: 0.770665
[5]	valid_0's auc: 0.772235
[6]	valid_0's auc: 0.772615
[7]	valid_0's auc: 0.771535
[8]	valid_0's auc: 0.771287
[9]	valid_0's auc: 0.77281
[10]	valid_0's auc: 0.772256
[11]	valid_0's auc: 0.772123
[12]	valid_0's auc: 0.771535
[13]	valid_0's auc: 0.771934
[14]	valid_0's auc: 0.771866
[15]	valid_0's auc: 0.772343
[16]	valid_0's auc: 0.772163
[17]	valid_0's auc: 0.773321
[18]	valid_0's auc: 0.773484
[19]	valid_0's auc: 0.773362
[20]	valid_0's auc: 0.773379
[21]	valid_0's auc: 0.774475
[22]	valid_0's auc: 0.774403
[23]	valid_0's auc: 0.774162
[24]	valid_0's auc: 0.774937
[25]	valid_0's auc: 0.774862
[26]	valid_0's auc: 0.775133
[27]	valid_0's auc: 0.776475
[28]	valid_0's auc: 0.77654
[29]	valid_0's auc: 0.777402
[30]	valid_0's auc: 0.777487
[31]	valid_0's auc: 0.777462
[32]	valid_0's auc: 0.777524
[33]	valid_0's auc: 0.778062
[34]	valid_0's auc: 0.777742
[35]	valid_0's auc: 0.778348
[36]	valid_0's auc: 0.778726
[37]	valid_0's auc: 0.779238
[38]	valid_0's auc: 0.779449
[39]	valid_0's auc: 0.780067
[40]	valid_0's auc: 0.779956
[41]	valid_0's auc: 0.780362
[42]	valid_0's auc: 0.780415
[43]	valid_0's auc: 0.780669
[44]	valid_0's auc: 0.780786
[45]	valid_0's auc: 0.781149
[46]	valid_0's auc: 0.781231
[47]	valid_0's auc: 0.781386
[48]	valid_0's auc: 0.781814
[49]	valid_0's auc: 0.78183
[50]	valid_0's auc: 0.781902
[51]	valid_0's auc: 0.782061
[52]	valid_0's auc: 0.782548
[53]	valid_0's auc: 0.782615
[54]	valid_0's auc: 0.783205
[55]	valid_0's auc: 0.784022
[56]	valid_0's auc: 0.784129
[57]	valid_0's auc: 0.784167
[58]	valid_0's auc: 0.784331
[59]	valid_0's auc: 0.78442
[60]	valid_0's auc: 0.784602
[61]	valid_0's auc: 0.784749
[62]	valid_0's auc: 0.784838
[63]	valid_0's auc: 0.785104
[64]	valid_0's auc: 0.785134
[65]	valid_0's auc: 0.785544
[66]	valid_0's auc: 0.785553
[67]	valid_0's auc: 0.785657
[68]	valid_0's auc: 0.785782
[69]	valid_0's auc: 0.786255
[70]	valid_0's auc: 0.786491
[71]	valid_0's auc: 0.78674
[72]	valid_0's auc: 0.786871
[73]	valid_0's auc: 0.786958
[74]	valid_0's auc: 0.787156
[75]	valid_0's auc: 0.787279
[76]	valid_0's auc: 0.78741
[77]	valid_0's auc: 0.787547
[78]	valid_0's auc: 0.787625
[79]	valid_0's auc: 0.787682
[80]	valid_0's auc: 0.787975
[81]	valid_0's auc: 0.788156
[82]	valid_0's auc: 0.788366
[83]	valid_0's auc: 0.788578
[84]	valid_0's auc: 0.788651
[85]	valid_0's auc: 0.788787
[86]	valid_0's auc: 0.789026
[87]	valid_0's auc: 0.789063
[88]	valid_0's auc: 0.789263
[89]	valid_0's auc: 0.789417
[90]	valid_0's auc: 0.789467
[91]	valid_0's auc: 0.789676
[92]	valid_0's auc: 0.789727
[93]	valid_0's auc: 0.7899
[94]	valid_0's auc: 0.790139
[95]	valid_0's auc: 0.790146
[96]	valid_0's auc: 0.790475
[97]	valid_0's auc: 0.790767
[98]	valid_0's auc: 0.790938
[99]	valid_0's auc: 0.791177
[100]	valid_0's auc: 0.791463
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.791463
[1]	valid_0's auc: 0.757698
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.757948
[3]	valid_0's auc: 0.758047
[4]	valid_0's auc: 0.758753
[5]	valid_0's auc: 0.761685
[6]	valid_0's auc: 0.761875
[7]	valid_0's auc: 0.761995
[8]	valid_0's auc: 0.762136
[9]	valid_0's auc: 0.762108
[10]	valid_0's auc: 0.761805
[11]	valid_0's auc: 0.762741
[12]	valid_0's auc: 0.763883
[13]	valid_0's auc: 0.764031
[14]	valid_0's auc: 0.764196
[15]	valid_0's auc: 0.764499
[16]	valid_0's auc: 0.764659
[17]	valid_0's auc: 0.764847
[18]	valid_0's auc: 0.76478
[19]	valid_0's auc: 0.770272
[20]	valid_0's auc: 0.77068
[21]	valid_0's auc: 0.771017
[22]	valid_0's auc: 0.771228
[23]	valid_0's auc: 0.771304
[24]	valid_0's auc: 0.772486
[25]	valid_0's auc: 0.772417
[26]	valid_0's auc: 0.772656
[27]	valid_0's auc: 0.773035
[28]	valid_0's auc: 0.773496
[29]	valid_0's auc: 0.773718
[30]	valid_0's auc: 0.773737
[31]	valid_0's auc: 0.774286
[32]	valid_0's auc: 0.774156
[33]	valid_0's auc: 0.774419
[34]	valid_0's auc: 0.775035
[35]	valid_0's auc: 0.775044
[36]	valid_0's auc: 0.775165
[37]	valid_0's auc: 0.776456
[38]	valid_0's auc: 0.776438
[39]	valid_0's auc: 0.777963
[40]	valid_0's auc: 0.77818
[41]	valid_0's auc: 0.778965
[42]	valid_0's auc: 0.779395
[43]	valid_0's auc: 0.779625
[44]	valid_0's auc: 0.779797
[45]	valid_0's auc: 0.780017
[46]	valid_0's auc: 0.78014
[47]	valid_0's auc: 0.780392
[48]	valid_0's auc: 0.780686
[49]	valid_0's auc: 0.780941
[50]	valid_0's auc: 0.781057
[51]	valid_0's auc: 0.781848
[52]	valid_0's auc: 0.781988
[53]	valid_0's auc: 0.782345
[54]	valid_0's auc: 0.782747
[55]	valid_0's auc: 0.782932
[56]	valid_0's auc: 0.783047
[57]	valid_0's auc: 0.783231
[58]	valid_0's auc: 0.78329
[59]	valid_0's auc: 0.783585
[60]	valid_0's auc: 0.783793
[61]	valid_0's auc: 0.783722
[62]	valid_0's auc: 0.783898
[63]	valid_0's auc: 0.784182
[64]	valid_0's auc: 0.784347
[65]	valid_0's auc: 0.784653
[66]	valid_0's auc: 0.784791
[67]	valid_0's auc: 0.784979
[68]	valid_0's auc: 0.785101
[69]	valid_0's auc: 0.785438
[70]	valid_0's auc: 0.785619
[71]	valid_0's auc: 0.785697
[72]	valid_0's auc: 0.785884
[73]	valid_0's auc: 0.786129
[74]	valid_0's auc: 0.786262
[75]	valid_0's auc: 0.786407
[76]	valid_0's auc: 0.78648
[77]	valid_0's auc: 0.786622
[78]	valid_0's auc: 0.786888
[79]	valid_0's auc: 0.787043
[80]	valid_0's auc: 0.787584
[81]	valid_0's auc: 0.787626
[82]	valid_0's auc: 0.787669
[83]	valid_0's auc: 0.788003
[84]	valid_0's auc: 0.788134
[85]	valid_0's auc: 0.788419
[86]	valid_0's auc: 0.788445
[87]	valid_0's auc: 0.788551
[88]	valid_0's auc: 0.78872
[89]	valid_0's auc: 0.789029
[90]	valid_0's auc: 0.789218
[91]	valid_0's auc: 0.789434
[92]	valid_0's auc: 0.789952
[93]	valid_0's auc: 0.790057
[94]	valid_0's auc: 0.79029
[95]	valid_0's auc: 0.790682
[96]	valid_0's auc: 0.791119
[97]	valid_0's auc: 0.7913
[98]	valid_0's auc: 0.791605
[99]	valid_0's auc: 0.791886
[100]	valid_0's auc: 0.792353
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.792353
[1]	valid_0's auc: 0.760154
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.760161
[3]	valid_0's auc: 0.761275
[4]	valid_0's auc: 0.761194
[5]	valid_0's auc: 0.761366
[6]	valid_0's auc: 0.761239
[7]	valid_0's auc: 0.764882
[8]	valid_0's auc: 0.766496
[9]	valid_0's auc: 0.766245
[10]	valid_0's auc: 0.766683
[11]	valid_0's auc: 0.767273
[12]	valid_0's auc: 0.767175
[13]	valid_0's auc: 0.767688
[14]	valid_0's auc: 0.767789
[15]	valid_0's auc: 0.768049
[16]	valid_0's auc: 0.768333
[17]	valid_0's auc: 0.768804
[18]	valid_0's auc: 0.768959
[19]	valid_0's auc: 0.76897
[20]	valid_0's auc: 0.769512
[21]	valid_0's auc: 0.769478
[22]	valid_0's auc: 0.769664
[23]	valid_0's auc: 0.769734
[24]	valid_0's auc: 0.769994
[25]	valid_0's auc: 0.770369
[26]	valid_0's auc: 0.770508
[27]	valid_0's auc: 0.770697
[28]	valid_0's auc: 0.770724
[29]	valid_0's auc: 0.771207
[30]	valid_0's auc: 0.77529
[31]	valid_0's auc: 0.775397
[32]	valid_0's auc: 0.776178
[33]	valid_0's auc: 0.776989
[34]	valid_0's auc: 0.778373
[35]	valid_0's auc: 0.778975
[36]	valid_0's auc: 0.779451
[37]	valid_0's auc: 0.779929
[38]	valid_0's auc: 0.780117
[39]	valid_0's auc: 0.780361
[40]	valid_0's auc: 0.780526
[41]	valid_0's auc: 0.780772
[42]	valid_0's auc: 0.781169
[43]	valid_0's auc: 0.781487
[44]	valid_0's auc: 0.781734
[45]	valid_0's auc: 0.781781
[46]	valid_0's auc: 0.782008
[47]	valid_0's auc: 0.782317
[48]	valid_0's auc: 0.782598
[49]	valid_0's auc: 0.782875
[50]	valid_0's auc: 0.783112
[51]	valid_0's auc: 0.783418
[52]	valid_0's auc: 0.783485
[53]	valid_0's auc: 0.783836
[54]	valid_0's auc: 0.783857
[55]	valid_0's auc: 0.783994
[56]	valid_0's auc: 0.784202
[57]	valid_0's auc: 0.784192
[58]	valid_0's auc: 0.784464
[59]	valid_0's auc: 0.785723
[60]	valid_0's auc: 0.78571
[61]	valid_0's auc: 0.78602
[62]	valid_0's auc: 0.7861
[63]	valid_0's auc: 0.786087
[64]	valid_0's auc: 0.786222
[65]	valid_0's auc: 0.786323
[66]	valid_0's auc: 0.786404
[67]	valid_0's auc: 0.786785
[68]	valid_0's auc: 0.786823
[69]	valid_0's auc: 0.787
[70]	valid_0's auc: 0.787013
[71]	valid_0's auc: 0.787273
[72]	valid_0's auc: 0.787597
[73]	valid_0's auc: 0.787749
[74]	valid_0's auc: 0.788552
[75]	valid_0's auc: 0.788607
[76]	valid_0's auc: 0.788887
[77]	valid_0's auc: 0.789096
[78]	valid_0's auc: 0.789653
[79]	valid_0's auc: 0.789639
[80]	valid_0's auc: 0.789877
[81]	valid_0's auc: 0.790263
[82]	valid_0's auc: 0.790395
[83]	valid_0's auc: 0.790688
[84]	valid_0's auc: 0.790733
[85]	valid_0's auc: 0.790967
[86]	valid_0's auc: 0.791113
[87]	valid_0's auc: 0.791254
[88]	valid_0's auc: 0.79137
[89]	valid_0's auc: 0.791645
[90]	valid_0's auc: 0.79172
[91]	valid_0's auc: 0.791966
[92]	valid_0's auc: 0.792199
[93]	valid_0's auc: 0.792516
[94]	valid_0's auc: 0.792844
[95]	valid_0's auc: 0.793063
[96]	valid_0's auc: 0.793207
[97]	valid_0's auc: 0.79347
[98]	valid_0's auc: 0.79384
[99]	valid_0's auc: 0.794088
[100]	valid_0's auc: 0.794243
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.794243
[1]	valid_0's auc: 0.759786
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.764657
[3]	valid_0's auc: 0.765416
[4]	valid_0's auc: 0.765122
[5]	valid_0's auc: 0.768802
[6]	valid_0's auc: 0.768371
[7]	valid_0's auc: 0.768223
[8]	valid_0's auc: 0.769696
[9]	valid_0's auc: 0.770084
[10]	valid_0's auc: 0.76985
[11]	valid_0's auc: 0.770084
[12]	valid_0's auc: 0.772553
[13]	valid_0's auc: 0.772833
[14]	valid_0's auc: 0.774962
[15]	valid_0's auc: 0.776731
[16]	valid_0's auc: 0.777085
[17]	valid_0's auc: 0.778218
[18]	valid_0's auc: 0.778599
[19]	valid_0's auc: 0.779262
[20]	valid_0's auc: 0.779785
[21]	valid_0's auc: 0.780662
[22]	valid_0's auc: 0.780817
[23]	valid_0's auc: 0.781245
[24]	valid_0's auc: 0.781309
[25]	valid_0's auc: 0.78137
[26]	valid_0's auc: 0.781547
[27]	valid_0's auc: 0.781556
[28]	valid_0's auc: 0.781683
[29]	valid_0's auc: 0.781957
[30]	valid_0's auc: 0.782143
[31]	valid_0's auc: 0.78226
[32]	valid_0's auc: 0.782258
[33]	valid_0's auc: 0.782146
[34]	valid_0's auc: 0.783026
[35]	valid_0's auc: 0.782994
[36]	valid_0's auc: 0.783926
[37]	valid_0's auc: 0.783892
[38]	valid_0's auc: 0.784254
[39]	valid_0's auc: 0.78469
[40]	valid_0's auc: 0.784658
[41]	valid_0's auc: 0.784715
[42]	valid_0's auc: 0.784692
[43]	valid_0's auc: 0.785067
[44]	valid_0's auc: 0.784851
[45]	valid_0's auc: 0.78527
[46]	valid_0's auc: 0.786097
[47]	valid_0's auc: 0.786103
[48]	valid_0's auc: 0.786391
[49]	valid_0's auc: 0.786103
[50]	valid_0's auc: 0.786322
[51]	valid_0's auc: 0.786486
[52]	valid_0's auc: 0.787117
[53]	valid_0's auc: 0.787146
[54]	valid_0's auc: 0.787302
[55]	valid_0's auc: 0.787534
[56]	valid_0's auc: 0.787693
[57]	valid_0's auc: 0.787819
[58]	valid_0's auc: 0.787985
[59]	valid_0's auc: 0.788091
[60]	valid_0's auc: 0.788141
[61]	valid_0's auc: 0.788182
[62]	valid_0's auc: 0.788354
[63]	valid_0's auc: 0.788629
[64]	valid_0's auc: 0.788776
[65]	valid_0's auc: 0.788969
[66]	valid_0's auc: 0.78913
[67]	valid_0's auc: 0.789357
[68]	valid_0's auc: 0.78942
[69]	valid_0's auc: 0.789737
[70]	valid_0's auc: 0.789753
[71]	valid_0's auc: 0.790198
[72]	valid_0's auc: 0.790309
[73]	valid_0's auc: 0.790432
[74]	valid_0's auc: 0.790811
[75]	valid_0's auc: 0.791035
[76]	valid_0's auc: 0.791106
[77]	valid_0's auc: 0.791114
[78]	valid_0's auc: 0.791274
[79]	valid_0's auc: 0.791496
[LightGBM training log condensed for readability; the per-iteration "valid_0's auc" lines are omitted. Each run trained with early_stopping_rounds=10, and no run stopped early — every completed run used all 100 boosting iterations.]

Training until validation scores don't improve for 10 rounds
Run  1: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.793566
Run  2: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.795711
Run  3: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.795675
Run  4: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.784431
Run  5: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.782663
Run  6: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.784314
Run  7: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.786199
Run  8: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.789034
Run  9: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.789128
Run 10: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.791463
Run 11: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.792353
Run 12: Did not meet early stopping. Best iteration: [100]	valid_0's auc: 0.794243
Run 13: log truncated at iteration [72]	valid_0's auc: 0.790309
[73]	valid_0's auc: 0.790432
[74]	valid_0's auc: 0.790811
[75]	valid_0's auc: 0.791035
[76]	valid_0's auc: 0.791106
[77]	valid_0's auc: 0.791114
[78]	valid_0's auc: 0.791274
[79]	valid_0's auc: 0.791496
[80]	valid_0's auc: 0.79162
[81]	valid_0's auc: 0.791681
[82]	valid_0's auc: 0.791771
[83]	valid_0's auc: 0.791911
[84]	valid_0's auc: 0.792032
[85]	valid_0's auc: 0.792057
[86]	valid_0's auc: 0.792311
[87]	valid_0's auc: 0.792332
[88]	valid_0's auc: 0.792474
[89]	valid_0's auc: 0.792526
[90]	valid_0's auc: 0.79275
[91]	valid_0's auc: 0.792853
[92]	valid_0's auc: 0.792966
[93]	valid_0's auc: 0.793104
[94]	valid_0's auc: 0.793066
[95]	valid_0's auc: 0.79324
[96]	valid_0's auc: 0.793248
[97]	valid_0's auc: 0.793377
[98]	valid_0's auc: 0.79337
[99]	valid_0's auc: 0.793458
[100]	valid_0's auc: 0.793566
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.793566
[1]	valid_0's auc: 0.762215
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.772303
[3]	valid_0's auc: 0.775269
[4]	valid_0's auc: 0.774358
[5]	valid_0's auc: 0.77402
[6]	valid_0's auc: 0.77701
[7]	valid_0's auc: 0.776882
[8]	valid_0's auc: 0.775852
[9]	valid_0's auc: 0.776006
[10]	valid_0's auc: 0.778096
[11]	valid_0's auc: 0.77758
[12]	valid_0's auc: 0.776888
[13]	valid_0's auc: 0.776609
[14]	valid_0's auc: 0.776707
[15]	valid_0's auc: 0.778561
[16]	valid_0's auc: 0.777955
[17]	valid_0's auc: 0.777753
[18]	valid_0's auc: 0.778763
[19]	valid_0's auc: 0.778335
[20]	valid_0's auc: 0.778246
[21]	valid_0's auc: 0.779333
[22]	valid_0's auc: 0.779276
[23]	valid_0's auc: 0.778949
[24]	valid_0's auc: 0.780416
[25]	valid_0's auc: 0.780641
[26]	valid_0's auc: 0.780962
[27]	valid_0's auc: 0.780591
[28]	valid_0's auc: 0.781559
[29]	valid_0's auc: 0.781823
[30]	valid_0's auc: 0.781878
[31]	valid_0's auc: 0.782692
[32]	valid_0's auc: 0.782631
[33]	valid_0's auc: 0.782614
[34]	valid_0's auc: 0.783374
[35]	valid_0's auc: 0.783385
[36]	valid_0's auc: 0.783939
[37]	valid_0's auc: 0.783825
[38]	valid_0's auc: 0.783982
[39]	valid_0's auc: 0.784469
[40]	valid_0's auc: 0.786002
[41]	valid_0's auc: 0.786036
[42]	valid_0's auc: 0.786187
[43]	valid_0's auc: 0.786532
[44]	valid_0's auc: 0.787004
[45]	valid_0's auc: 0.787682
[46]	valid_0's auc: 0.787881
[47]	valid_0's auc: 0.788149
[48]	valid_0's auc: 0.788099
[49]	valid_0's auc: 0.788289
[50]	valid_0's auc: 0.788382
[51]	valid_0's auc: 0.788686
[52]	valid_0's auc: 0.788752
[53]	valid_0's auc: 0.789079
[54]	valid_0's auc: 0.789559
[55]	valid_0's auc: 0.7896
[56]	valid_0's auc: 0.789684
[57]	valid_0's auc: 0.78988
[58]	valid_0's auc: 0.789933
[59]	valid_0's auc: 0.79029
[60]	valid_0's auc: 0.790306
[61]	valid_0's auc: 0.790442
[62]	valid_0's auc: 0.790457
[63]	valid_0's auc: 0.790812
[64]	valid_0's auc: 0.791001
[65]	valid_0's auc: 0.791055
[66]	valid_0's auc: 0.791488
[67]	valid_0's auc: 0.791523
[68]	valid_0's auc: 0.791666
[69]	valid_0's auc: 0.791733
[70]	valid_0's auc: 0.791853
[71]	valid_0's auc: 0.792079
[72]	valid_0's auc: 0.792187
[73]	valid_0's auc: 0.792463
[74]	valid_0's auc: 0.792603
[75]	valid_0's auc: 0.792743
[76]	valid_0's auc: 0.792738
[77]	valid_0's auc: 0.792841
[78]	valid_0's auc: 0.792955
[79]	valid_0's auc: 0.793102
[80]	valid_0's auc: 0.793241
[81]	valid_0's auc: 0.793332
[82]	valid_0's auc: 0.793376
[83]	valid_0's auc: 0.793485
[84]	valid_0's auc: 0.793612
[85]	valid_0's auc: 0.793715
[86]	valid_0's auc: 0.793893
[87]	valid_0's auc: 0.79398
[88]	valid_0's auc: 0.794051
[89]	valid_0's auc: 0.794168
[90]	valid_0's auc: 0.794333
[91]	valid_0's auc: 0.794413
[92]	valid_0's auc: 0.794751
[93]	valid_0's auc: 0.794833
[94]	valid_0's auc: 0.794992
[95]	valid_0's auc: 0.795076
[96]	valid_0's auc: 0.795215
[97]	valid_0's auc: 0.795345
[98]	valid_0's auc: 0.79546
[99]	valid_0's auc: 0.79568
[100]	valid_0's auc: 0.795711
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.795711
[1]	valid_0's auc: 0.760477
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.760477
[3]	valid_0's auc: 0.761049
[4]	valid_0's auc: 0.762803
[5]	valid_0's auc: 0.764426
[6]	valid_0's auc: 0.76481
[7]	valid_0's auc: 0.764804
[8]	valid_0's auc: 0.76486
[9]	valid_0's auc: 0.765372
[10]	valid_0's auc: 0.765314
[11]	valid_0's auc: 0.766307
[12]	valid_0's auc: 0.766468
[13]	valid_0's auc: 0.766473
[14]	valid_0's auc: 0.766878
[15]	valid_0's auc: 0.767154
[16]	valid_0's auc: 0.771358
[17]	valid_0's auc: 0.771997
[18]	valid_0's auc: 0.77235
[19]	valid_0's auc: 0.773067
[20]	valid_0's auc: 0.77316
[21]	valid_0's auc: 0.773603
[22]	valid_0's auc: 0.773982
[23]	valid_0's auc: 0.77437
[24]	valid_0's auc: 0.776197
[25]	valid_0's auc: 0.776418
[26]	valid_0's auc: 0.777278
[27]	valid_0's auc: 0.777414
[28]	valid_0's auc: 0.777641
[29]	valid_0's auc: 0.778541
[30]	valid_0's auc: 0.778702
[31]	valid_0's auc: 0.778776
[32]	valid_0's auc: 0.779424
[33]	valid_0's auc: 0.781895
[34]	valid_0's auc: 0.782156
[35]	valid_0's auc: 0.782471
[36]	valid_0's auc: 0.78238
[37]	valid_0's auc: 0.782442
[38]	valid_0's auc: 0.782723
[39]	valid_0's auc: 0.783611
[40]	valid_0's auc: 0.783936
[41]	valid_0's auc: 0.784154
[42]	valid_0's auc: 0.784687
[43]	valid_0's auc: 0.784856
[44]	valid_0's auc: 0.785017
[45]	valid_0's auc: 0.785474
[46]	valid_0's auc: 0.785745
[47]	valid_0's auc: 0.786119
[48]	valid_0's auc: 0.786212
[49]	valid_0's auc: 0.78609
[50]	valid_0's auc: 0.786079
[51]	valid_0's auc: 0.786433
[52]	valid_0's auc: 0.786534
[53]	valid_0's auc: 0.78668
[54]	valid_0's auc: 0.786942
[55]	valid_0's auc: 0.787322
[56]	valid_0's auc: 0.78739
[57]	valid_0's auc: 0.787454
[58]	valid_0's auc: 0.787588
[59]	valid_0's auc: 0.787963
[60]	valid_0's auc: 0.788005
[61]	valid_0's auc: 0.788147
[62]	valid_0's auc: 0.788331
[63]	valid_0's auc: 0.788597
[64]	valid_0's auc: 0.789037
[65]	valid_0's auc: 0.789185
[66]	valid_0's auc: 0.78975
[67]	valid_0's auc: 0.789884
[68]	valid_0's auc: 0.789872
[69]	valid_0's auc: 0.790233
[70]	valid_0's auc: 0.790472
[71]	valid_0's auc: 0.790604
[72]	valid_0's auc: 0.790855
[73]	valid_0's auc: 0.790919
[74]	valid_0's auc: 0.791129
[75]	valid_0's auc: 0.79132
[76]	valid_0's auc: 0.791519
[77]	valid_0's auc: 0.791642
[78]	valid_0's auc: 0.791805
[79]	valid_0's auc: 0.791894
[80]	valid_0's auc: 0.792156
[81]	valid_0's auc: 0.79229
[82]	valid_0's auc: 0.79234
[83]	valid_0's auc: 0.792528
[84]	valid_0's auc: 0.792846
[85]	valid_0's auc: 0.792962
[86]	valid_0's auc: 0.793151
[87]	valid_0's auc: 0.793373
[88]	valid_0's auc: 0.793494
[89]	valid_0's auc: 0.793592
[90]	valid_0's auc: 0.793746
[91]	valid_0's auc: 0.793984
[92]	valid_0's auc: 0.794137
[93]	valid_0's auc: 0.794296
[94]	valid_0's auc: 0.794592
[95]	valid_0's auc: 0.794654
[96]	valid_0's auc: 0.79484
[97]	valid_0's auc: 0.795148
[98]	valid_0's auc: 0.795326
[99]	valid_0's auc: 0.795606
[100]	valid_0's auc: 0.795675
Did not meet early stopping. Best iteration is:
[100]	valid_0's auc: 0.795675
[1]	valid_0's auc: 0.7495
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.7495
[3]	valid_0's auc: 0.7495
[4]	valid_0's auc: 0.7495
[5]	valid_0's auc: 0.7495
[6]	valid_0's auc: 0.7495
[7]	valid_0's auc: 0.7495
[8]	valid_0's auc: 0.7495
[9]	valid_0's auc: 0.7495
[10]	valid_0's auc: 0.7495
[11]	valid_0's auc: 0.7495
Early stopping, best iteration is:
[1]	valid_0's auc: 0.7495
[1]	valid_0's auc: 0.748453
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.748453
[3]	valid_0's auc: 0.748453
[4]	valid_0's auc: 0.748453
[5]	valid_0's auc: 0.748453
[6]	valid_0's auc: 0.748453
[7]	valid_0's auc: 0.748453
[8]	valid_0's auc: 0.748453
[9]	valid_0's auc: 0.748453
[10]	valid_0's auc: 0.748453
[11]	valid_0's auc: 0.748453
Early stopping, best iteration is:
[1]	valid_0's auc: 0.748453
[1]	valid_0's auc: 0.748867
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.748867
[3]	valid_0's auc: 0.748867
[4]	valid_0's auc: 0.748867
[5]	valid_0's auc: 0.748867
[6]	valid_0's auc: 0.748867
[7]	valid_0's auc: 0.748867
[8]	valid_0's auc: 0.748867
[9]	valid_0's auc: 0.748867
[10]	valid_0's auc: 0.748867
[11]	valid_0's auc: 0.748867
Early stopping, best iteration is:
[1]	valid_0's auc: 0.748867
[1]	valid_0's auc: 0.748286
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.748286
[3]	valid_0's auc: 0.748286
[4]	valid_0's auc: 0.748286
[5]	valid_0's auc: 0.748286
[6]	valid_0's auc: 0.748286
[7]	valid_0's auc: 0.748286
[8]	valid_0's auc: 0.748286
[9]	valid_0's auc: 0.748286
[10]	valid_0's auc: 0.748286
[11]	valid_0's auc: 0.748286
Early stopping, best iteration is:
[1]	valid_0's auc: 0.748286
[1]	valid_0's auc: 0.757043
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.757043
[3]	valid_0's auc: 0.757043
[4]	valid_0's auc: 0.757043
[5]	valid_0's auc: 0.757043
[6]	valid_0's auc: 0.757043
[7]	valid_0's auc: 0.757043
[8]	valid_0's auc: 0.757043
[9]	valid_0's auc: 0.757043
[10]	valid_0's auc: 0.757043
[11]	valid_0's auc: 0.757043
Early stopping, best iteration is:
[1]	valid_0's auc: 0.757043
[1]	valid_0's auc: 0.752298
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.752298
[3]	valid_0's auc: 0.752298
[4]	valid_0's auc: 0.752298
[5]	valid_0's auc: 0.752298
[6]	valid_0's auc: 0.752298
[7]	valid_0's auc: 0.752298
[8]	valid_0's auc: 0.752298
[9]	valid_0's auc: 0.752298
[10]	valid_0's auc: 0.752298
[11]	valid_0's auc: 0.752298
Early stopping, best iteration is:
[1]	valid_0's auc: 0.752298
[1]	valid_0's auc: 0.754636
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.754636
[3]	valid_0's auc: 0.754636
[4]	valid_0's auc: 0.754636
[5]	valid_0's auc: 0.754636
[6]	valid_0's auc: 0.754636
[7]	valid_0's auc: 0.754636
[8]	valid_0's auc: 0.754636
[9]	valid_0's auc: 0.754636
[10]	valid_0's auc: 0.754636
[11]	valid_0's auc: 0.754636
Early stopping, best iteration is:
[1]	valid_0's auc: 0.754636
[1]	valid_0's auc: 0.757698
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.757698
[3]	valid_0's auc: 0.757698
[4]	valid_0's auc: 0.757698
[5]	valid_0's auc: 0.757698
[6]	valid_0's auc: 0.757698
[7]	valid_0's auc: 0.757698
[8]	valid_0's auc: 0.757698
[9]	valid_0's auc: 0.757698
[10]	valid_0's auc: 0.757698
[11]	valid_0's auc: 0.757698
Early stopping, best iteration is:
[1]	valid_0's auc: 0.757698
[1]	valid_0's auc: 0.760154
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.760154
[3]	valid_0's auc: 0.760154
[4]	valid_0's auc: 0.760154
[5]	valid_0's auc: 0.760154
[6]	valid_0's auc: 0.760154
[7]	valid_0's auc: 0.760154
[8]	valid_0's auc: 0.760154
[9]	valid_0's auc: 0.760154
[10]	valid_0's auc: 0.760154
[11]	valid_0's auc: 0.760154
Early stopping, best iteration is:
[1]	valid_0's auc: 0.760154
[1]	valid_0's auc: 0.759786
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.759786
[3]	valid_0's auc: 0.759786
[4]	valid_0's auc: 0.759786
[5]	valid_0's auc: 0.759786
[6]	valid_0's auc: 0.759786
[7]	valid_0's auc: 0.759786
[8]	valid_0's auc: 0.759786
[9]	valid_0's auc: 0.759786
[10]	valid_0's auc: 0.759786
[11]	valid_0's auc: 0.759786
Early stopping, best iteration is:
[1]	valid_0's auc: 0.759786
[1]	valid_0's auc: 0.762215
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.762215
[3]	valid_0's auc: 0.762215
[4]	valid_0's auc: 0.762215
[5]	valid_0's auc: 0.762215
[6]	valid_0's auc: 0.762215
[7]	valid_0's auc: 0.762215
[8]	valid_0's auc: 0.762215
[9]	valid_0's auc: 0.762215
[10]	valid_0's auc: 0.762215
[11]	valid_0's auc: 0.762215
Early stopping, best iteration is:
[1]	valid_0's auc: 0.762215
[1]	valid_0's auc: 0.760477
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.760477
[3]	valid_0's auc: 0.760477
[4]	valid_0's auc: 0.760477
[5]	valid_0's auc: 0.760477
[6]	valid_0's auc: 0.760477
[7]	valid_0's auc: 0.760477
[8]	valid_0's auc: 0.760477
[9]	valid_0's auc: 0.760477
[10]	valid_0's auc: 0.760477
[11]	valid_0's auc: 0.760477
Early stopping, best iteration is:
[1]	valid_0's auc: 0.760477
[1]	valid_0's auc: 0.7495
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.7495
[3]	valid_0's auc: 0.7495
[4]	valid_0's auc: 0.7495
[5]	valid_0's auc: 0.7495
[6]	valid_0's auc: 0.7495
[7]	valid_0's auc: 0.7495
[8]	valid_0's auc: 0.7495
[9]	valid_0's auc: 0.7495
[10]	valid_0's auc: 0.7495
[11]	valid_0's auc: 0.7495
Early stopping, best iteration is:
[1]	valid_0's auc: 0.7495
[1]	valid_0's auc: 0.748453
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.748453
[3]	valid_0's auc: 0.748453
[4]	valid_0's auc: 0.748453
[5]	valid_0's auc: 0.748453
[6]	valid_0's auc: 0.748453
[7]	valid_0's auc: 0.748453
[8]	valid_0's auc: 0.748453
[9]	valid_0's auc: 0.748453
[10]	valid_0's auc: 0.748453
[11]	valid_0's auc: 0.748453
Early stopping, best iteration is:
[1]	valid_0's auc: 0.748453
[1]	valid_0's auc: 0.748867
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.748867
[3]	valid_0's auc: 0.748867
[4]	valid_0's auc: 0.748867
[5]	valid_0's auc: 0.748867
[6]	valid_0's auc: 0.748867
[7]	valid_0's auc: 0.748867
[8]	valid_0's auc: 0.748867
[9]	valid_0's auc: 0.748867
[10]	valid_0's auc: 0.748867
[11]	valid_0's auc: 0.748867
Early stopping, best iteration is:
[1]	valid_0's auc: 0.748867
[1]	valid_0's auc: 0.748286
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.748286
[3]	valid_0's auc: 0.748286
[4]	valid_0's auc: 0.748286
[5]	valid_0's auc: 0.748286
[6]	valid_0's auc: 0.748286
[7]	valid_0's auc: 0.748286
[8]	valid_0's auc: 0.748286
[9]	valid_0's auc: 0.748286
[10]	valid_0's auc: 0.748286
[11]	valid_0's auc: 0.748286
Early stopping, best iteration is:
[1]	valid_0's auc: 0.748286
[1]	valid_0's auc: 0.757043
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.757043
[3]	valid_0's auc: 0.757043
[4]	valid_0's auc: 0.757043
[5]	valid_0's auc: 0.757043
[6]	valid_0's auc: 0.757043
[7]	valid_0's auc: 0.757043
[8]	valid_0's auc: 0.757043
[9]	valid_0's auc: 0.757043
[10]	valid_0's auc: 0.757043
[11]	valid_0's auc: 0.757043
Early stopping, best iteration is:
[1]	valid_0's auc: 0.757043
[1]	valid_0's auc: 0.752298
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.752298
[3]	valid_0's auc: 0.752298
[4]	valid_0's auc: 0.752298
[5]	valid_0's auc: 0.752298
[6]	valid_0's auc: 0.752298
[7]	valid_0's auc: 0.752298
[8]	valid_0's auc: 0.752298
[9]	valid_0's auc: 0.752298
[10]	valid_0's auc: 0.752298
[11]	valid_0's auc: 0.752298
Early stopping, best iteration is:
[1]	valid_0's auc: 0.752298
[1]	valid_0's auc: 0.754636
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.754636
[3]	valid_0's auc: 0.754636
[4]	valid_0's auc: 0.754636
[5]	valid_0's auc: 0.754636
[6]	valid_0's auc: 0.754636
[7]	valid_0's auc: 0.754636
[8]	valid_0's auc: 0.754636
[9]	valid_0's auc: 0.754636
[10]	valid_0's auc: 0.754636
[11]	valid_0's auc: 0.754636
Early stopping, best iteration is:
[1]	valid_0's auc: 0.754636
[1]	valid_0's auc: 0.757698
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.757698
[3]	valid_0's auc: 0.757698
[4]	valid_0's auc: 0.757698
[5]	valid_0's auc: 0.757698
[6]	valid_0's auc: 0.757698
[7]	valid_0's auc: 0.757698
[8]	valid_0's auc: 0.757698
[9]	valid_0's auc: 0.757698
[10]	valid_0's auc: 0.757698
[11]	valid_0's auc: 0.757698
Early stopping, best iteration is:
[1]	valid_0's auc: 0.757698
[1]	valid_0's auc: 0.760154
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.760154
[3]	valid_0's auc: 0.760154
[4]	valid_0's auc: 0.760154
[5]	valid_0's auc: 0.760154
[6]	valid_0's auc: 0.760154
[7]	valid_0's auc: 0.760154
[8]	valid_0's auc: 0.760154
[9]	valid_0's auc: 0.760154
[10]	valid_0's auc: 0.760154
[11]	valid_0's auc: 0.760154
Early stopping, best iteration is:
[1]	valid_0's auc: 0.760154
[1]	valid_0's auc: 0.759786
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.759786
[3]	valid_0's auc: 0.759786
[4]	valid_0's auc: 0.759786
[5]	valid_0's auc: 0.759786
[6]	valid_0's auc: 0.759786
[7]	valid_0's auc: 0.759786
[8]	valid_0's auc: 0.759786
[9]	valid_0's auc: 0.759786
[10]	valid_0's auc: 0.759786
[11]	valid_0's auc: 0.759786
Early stopping, best iteration is:
[1]	valid_0's auc: 0.759786
[1]	valid_0's auc: 0.762215
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.762215
[3]	valid_0's auc: 0.762215
[4]	valid_0's auc: 0.762215
[5]	valid_0's auc: 0.762215
[6]	valid_0's auc: 0.762215
[7]	valid_0's auc: 0.762215
[8]	valid_0's auc: 0.762215
[9]	valid_0's auc: 0.762215
[10]	valid_0's auc: 0.762215
[11]	valid_0's auc: 0.762215
Early stopping, best iteration is:
[1]	valid_0's auc: 0.762215
[1]	valid_0's auc: 0.760477
Training until validation scores don't improve for 10 rounds
[2]	valid_0's auc: 0.760477
[3]	valid_0's auc: 0.760477
[4]	valid_0's auc: 0.760477
[5]	valid_0's auc: 0.760477
[6]	valid_0's auc: 0.760477
[7]	valid_0's auc: 0.760477
[8]	valid_0's auc: 0.760477
[9]	valid_0's auc: 0.760477
[10]	valid_0's auc: 0.760477
[11]	valid_0's auc: 0.760477
Early stopping, best iteration is:
[1]	valid_0's auc: 0.760477
[1]	valid_0's auc: 0.762923
[2]	valid_0's auc: 0.762923
[3]	valid_0's auc: 0.762982
[4]	valid_0's auc: 0.76328
[5]	valid_0's auc: 0.76328
[6]	valid_0's auc: 0.76338
[7]	valid_0's auc: 0.76338
[8]	valid_0's auc: 0.763386
[9]	valid_0's auc: 0.76536
[10]	valid_0's auc: 0.767171
[11]	valid_0's auc: 0.767107
[12]	valid_0's auc: 0.767159
[13]	valid_0's auc: 0.772416
[14]	valid_0's auc: 0.772331
[15]	valid_0's auc: 0.77242
[16]	valid_0's auc: 0.772598
[17]	valid_0's auc: 0.773041
[18]	valid_0's auc: 0.773071
[19]	valid_0's auc: 0.772971
[20]	valid_0's auc: 0.772972
[21]	valid_0's auc: 0.772936
[22]	valid_0's auc: 0.772894
[23]	valid_0's auc: 0.772985
[24]	valid_0's auc: 0.773115
[25]	valid_0's auc: 0.773491
[26]	valid_0's auc: 0.773984
[27]	valid_0's auc: 0.775296
[28]	valid_0's auc: 0.775961
[29]	valid_0's auc: 0.776052
[30]	valid_0's auc: 0.775913
[31]	valid_0's auc: 0.776688
[32]	valid_0's auc: 0.776406
[33]	valid_0's auc: 0.776428
[34]	valid_0's auc: 0.776256
[35]	valid_0's auc: 0.776261
[36]	valid_0's auc: 0.776724
[37]	valid_0's auc: 0.776823
[38]	valid_0's auc: 0.777754
[39]	valid_0's auc: 0.777335
[40]	valid_0's auc: 0.777309
[41]	valid_0's auc: 0.777309
[42]	valid_0's auc: 0.777453
[43]	valid_0's auc: 0.778095
[44]	valid_0's auc: 0.778059
[45]	valid_0's auc: 0.777958
[46]	valid_0's auc: 0.778428
[47]	valid_0's auc: 0.77833
[48]	valid_0's auc: 0.778252
[49]	valid_0's auc: 0.778658
[50]	valid_0's auc: 0.778503
[51]	valid_0's auc: 0.779384
[52]	valid_0's auc: 0.779283
[53]	valid_0's auc: 0.779404
[54]	valid_0's auc: 0.779851
[55]	valid_0's auc: 0.779849
[56]	valid_0's auc: 0.780255
[57]	valid_0's auc: 0.780341
[58]	valid_0's auc: 0.780195
[59]	valid_0's auc: 0.780144
[60]	valid_0's auc: 0.779917
[61]	valid_0's auc: 0.779942
[62]	valid_0's auc: 0.780441
[63]	valid_0's auc: 0.780375
[64]	valid_0's auc: 0.780362
[65]	valid_0's auc: 0.781032
[66]	valid_0's auc: 0.780952
[67]	valid_0's auc: 0.780989
[68]	valid_0's auc: 0.781004
[69]	valid_0's auc: 0.781033
[70]	valid_0's auc: 0.781128
[71]	valid_0's auc: 0.781141
[72]	valid_0's auc: 0.780912
[73]	valid_0's auc: 0.780887
[74]	valid_0's auc: 0.780896
[75]	valid_0's auc: 0.780868
[76]	valid_0's auc: 0.780729
[77]	valid_0's auc: 0.78114
[78]	valid_0's auc: 0.781278
[79]	valid_0's auc: 0.781385
[80]	valid_0's auc: 0.781349
[81]	valid_0's auc: 0.781434
[82]	valid_0's auc: 0.781471
[83]	valid_0's auc: 0.781478
[84]	valid_0's auc: 0.781458
[85]	valid_0's auc: 0.782029
[86]	valid_0's auc: 0.781957
[87]	valid_0's auc: 0.781963
[88]	valid_0's auc: 0.781996
[89]	valid_0's auc: 0.781997
[90]	valid_0's auc: 0.781915
[91]	valid_0's auc: 0.781959
[92]	valid_0's auc: 0.781939
[93]	valid_0's auc: 0.782183
[94]	valid_0's auc: 0.782168
[95]	valid_0's auc: 0.782084
[96]	valid_0's auc: 0.782157
[97]	valid_0's auc: 0.782213
[98]	valid_0's auc: 0.78221
[99]	valid_0's auc: 0.782194
[100]	valid_0's auc: 0.782236
Out[97]:
GridSearchCV(cv=4, error_score=nan,
             estimator=LGBMClassifier(boosting_type='gbdt', class_weight=None,
                                      colsample_bytree=1.0,
                                      importance_type='split',
                                      learning_rate=0.1, max_depth=-1,
                                      metric='auc', min_child_samples=20,
                                      min_child_weight=0.001,
                                      min_split_gain=0.0, n_estimators=100,
                                      n_jobs=-1, num_iterations=100,
                                      num_leaves=31, objective='binary',
                                      random_state=None, reg...ha=0.0,
                                      reg_lambda=0.0, silent=True,
                                      subsample=1.0, subsample_for_bin=200000,
                                      subsample_freq=0),
             iid='deprecated', n_jobs=None,
             param_grid={'boosting_type': ['dart', 'gbdt'],
                         'learning_rate': [0.005, 0.01, 0.0001],
                         'n_estimators': [3, 5], 'num_leaves': [12, 16, 20],
                         'random_state': [501]},
             pre_dispatch='2*n_jobs', refit=True, return_train_score=False,
             scoring=None, verbose=0)
In [98]:
lgb_grid_mdl.best_params_
Out[98]:
{'boosting_type': 'dart',
 'learning_rate': 0.01,
 'n_estimators': 3,
 'num_leaves': 20,
 'random_state': 501}
In [99]:
train_probs_lgb = lgb_grid_mdl.predict_proba(x_train_rf)
test_probs_lgb = lgb_grid_mdl.predict_proba(x_test)

train_probs_df_lgb = pd.DataFrame(train_probs_lgb)
test_probs_df_lgb = pd.DataFrame(test_probs_lgb)

train_probs_df_lgb.columns = ['trainprobs' + str(col) for col in train_probs_df_lgb.columns]
test_probs_df_lgb.columns = ['testprobs' + str(col) for col in test_probs_df_lgb.columns]
In [100]:
fpr, tpr, thresholds = metrics.roc_curve(y_train_rf, train_probs_lgb[:,1])
metrics.auc(fpr, tpr)
Out[100]:
0.7924184238583627
In [101]:
fpr, tpr, thresholds = metrics.roc_curve(y_test, test_probs_lgb[:,1])
metrics.auc(fpr, tpr)
Out[101]:
0.78052522159409
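As an aside, the two-step roc_curve + auc computation used above is equivalent to scikit-learn's one-call roc_auc_score. A small sketch on toy labels (the values below are illustrative, not from the lab data):

```python
import numpy as np
from sklearn import metrics

# Toy labels and predicted probabilities (illustrative only)
y = np.array([0, 0, 1, 1])
p = np.array([0.1, 0.4, 0.35, 0.8])

# Two-step computation, as in the cells above
fpr, tpr, thresholds = metrics.roc_curve(y, p)
auc_two_step = metrics.auc(fpr, tpr)

# One-call equivalent
auc_direct = metrics.roc_auc_score(y, p)

print(auc_two_step, auc_direct)  # both 0.75
```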

This grid-searched model clearly performs worse (Train AUC ≈ 0.792, Test AUC ≈ 0.781) than the manually tuned one.

Finally, using LightGBM with manual parameter tuning, we obtain a Train AUC of 0.87 and a Validation AUC of 0.84.

The difference between Train and Validation AUC is approximately 0.03, so this model also generalizes well without overfitting.

Hence, among the three Gradient Boosted Decision Tree models we evaluated, LightGBM gives the best AUC while keeping the train-validation gap at about 0.03.

Although XGBoost and CatBoost have even smaller train-test gaps and also generalize well, we chose LightGBM because it has the highest AUC.

Exporting data to a CSV file

In [102]:
# Reading in test data file
lab_test_df = pd.read_csv('PYTHON_LAB_DF_TEST_2.csv', sep = ",")
python_lab_test_df = lab_test_df.copy()
In [103]:
# Label-encode the categorical columns (cast to str first so missing values encode consistently)
le = preprocessing.LabelEncoder()
cat_cols = ['SCHED_SURG_AREA', 'RACE', 'ETHNIC_GROUP', 'SCHED_HOSPITAL',
            'SCHED_SURG_PROC_CD', 'FEMALE', 'CAV_REC_SEX', 'CAV_REC_LANG',
            'CAV_REC_IPOP', 'CAV_REC_PRIORITY_CODE', 'CAV_REC_DISP_CODE']
for col in cat_cols:
    python_lab_test_df[col] = le.fit_transform(python_lab_test_df[col].astype(str))
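One caveat with re-fitting LabelEncoder on the test file: the integer codes depend on which categories happen to appear, so the same category can receive a different code than it did at training time. A toy sketch of the pitfall (the category names are illustrative, not from the lab data):

```python
from sklearn import preprocessing

le_train = preprocessing.LabelEncoder()
le_train.fit(['Cardiac', 'General', 'Ortho'])   # categories seen at training time
le_test = preprocessing.LabelEncoder()
le_test.fit(['General', 'Ortho'])               # test file is missing 'Cardiac'

print(le_train.transform(['General'])[0])  # 1
print(le_test.transform(['General'])[0])   # 0 -- the code shifted
```

If the model depends on the exact integer codes, the safer pattern is to persist the encoders fitted on the training data and call only transform on the test data.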
In [104]:
python_lab_test_df = python_lab_test_df.drop(columns = ['PROC_DATE', 'CREATE_DT_TM', 'SCHED_START_DT_TM'])
In [105]:
predict_lab_test = gbm_rh.predict(python_lab_test_df.drop(columns = ['ID1']))
In [106]:
predict_lab_test
Out[106]:
array([0.7274571 , 0.68008264, 0.79247017, ..., 0.74397117, 0.84123845,
       0.38515668])
In [107]:
predict_lab_test_df = python_lab_test_df[['ID1']].copy()
predict_lab_test_df['LOS_PROB'] = predict_lab_test
In [108]:
predict_lab_test_df.head()
Out[108]:
ID1 LOS_PROB
0 1 0.727457
1 2 0.680083
2 3 0.792470
3 4 0.180654
4 5 0.131683
In [109]:
# Exporting the prediction probabilities to a CSV file.
predict_lab_test_df.to_csv("PYTHON_LAB_TEST_PROBABILITY.csv", sep=',', index=False)
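A quick sanity check after exporting is to read the CSV back and confirm the column layout survives the round trip. A sketch on a toy frame with the same schema (written to an in-memory buffer rather than the actual output file):

```python
import io
import pandas as pd

# Toy frame with the same schema as the exported predictions
out = pd.DataFrame({'ID1': [1, 2, 3], 'LOS_PROB': [0.727457, 0.680083, 0.792470]})

buf = io.StringIO()
out.to_csv(buf, sep=',', index=False)
buf.seek(0)

back = pd.read_csv(buf)
print(list(back.columns))  # ['ID1', 'LOS_PROB']
print(len(back))           # 3
```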

Checking variable/feature importance

In [110]:
# Importing SHAP and LIME
import shap
import lime
In [111]:
import lime.lime_tabular
In [112]:
gbm_features = list(x_train.columns)
In [113]:
print(gbm_features)
['SCHED_SURG_AREA', 'RACE', 'ETHNIC_GROUP', 'SCHED_HOSPITAL', 'SCHED_SURG_PROC_CD', 'FEMALE', 'AGE_ON_CONTACT_DATE', 'BMI', 'WEIGHT', 'BP_SYSTOLIC', 'BP_DIASTOLIC', 'PULSE', 'PCPVISIT', 'METFORMIN_FLAG', 'OPIOIDS_FLAG', 'ALPHA_BLOCKERS', 'CENTRAL_ANTAGONISTS', 'RENIN', 'BETA_BLOCKERS', 'ACE_INHIB', 'ARB', 'ALDOSTERONE_BLOCKERS', 'VASODIALATORS', 'DIURETICS', 'CALCIUM_BLOCKERS', 'STATINS', 'INSULIN_MEDS', 'ASPIRIN', 'WARFARIN', 'DOACS', 'PRETERM_17P', 'MEDROL', 'PREDNISONE', 'INHALED_STEROID_WITH_LABA', 'INHALED_STEROID_WITHOUT_LABA', 'INHALED_STEROIDS', 'ASTHMA_BIOLOGICS', 'SHORT_ACTING_BRONCHO_DIALATORS', 'TNF_INHIBITORS', 'IMMUNOMODULATORS', 'AMINOSALICYLATES', 'CORTICOSTEROIDS', 'ARNI', 'ALLOPURINOL', 'SEIZURE', 'MUSCLERELAXANT', 'DIGOXIN', 'INOTROPES', 'ANTI_ARRHYTHMIC', 'ANTIPLATELET', 'SULFONYLUREA', 'GLP_1_AGONIST', 'THIAZOLIDINEDIONE', 'SGLT2_INHIBITOR', 'DPP4_INHIBITOR', 'ALPHA_GLUCOSIDASE_INHIBITOR', 'AMYLINOMIMETIC', 'RAPID_ACTING_INSULIN', 'SHORT_ACTING_INSULIN', 'INTERMEDIATE_ACTING_INSULIN', 'LONG_ACTING_INSULIN', 'MINOCYCLINE', 'DOXYCYCLINE', 'MELATONIN', 'METHAZOLAMIDE', 'HYDROXYCHLOROQUINE', 'ITTC', 'DMARDS', 'OBESE_HST', 'MORBIDOBESE_HST', 'PH_HST', 'AFIB_HST', 'COPD_HST', 'CHF_HST', 'DIAB_HST', 'CAD_HST', 'OSTEO_HST', 'HTN_HST', 'CANCER_HST', 'LUNG_CANCER_HST', 'OVARIAN_CANCER_HST', 'HEAD_NECK_CANCER_HST', 'BREAST_CANCER_HST', 'ASTHMA_HST', 'GERD_HST', 'FIBROMYALGIA_HST', 'DEPRESSION_HST', 'PSORIATIC_ARTHRITIS_HST', 'RHEUM_ARTHRITIS_HST', 'LUPUS_HST', 'VTVF_HST', 'STROKE_HST', 'VASCULARDISEASE_HST', 'LOWBACKPAIN_HST', 'DVT_HST', 'PE_HST', 'HYPOTHYROIDISM_HST', 'ADRENAL_INSUFFICIENCY_HST', 'INFERTILITY_HST', 'CKD_HST', 'ESRD_HST', 'OBS_SLEEPAPNEA_HST', 'CARDIAC_ARREST_HST', 'HEMO_STROKE_HST', 'MAJOR_BLEED_HST', 'MACULAR_DEGEN_HST', 'ANXIETY_HST', 'HYPERLIPIDEMIA_HST', 'HIV_HST', 'ALZHEIMER_HST', 'COLORECTAL_CANCER_HST', 'ENDOMETRIAL_CANCER_HST', 'GLAUCOMA_HST', 'HIP_PELVIC_FRACTURE_HST', 'BENIGN_PROSTATIC_HYPERPLASIA_HST', 'CIRRHOSIS_HST', 
'CIRRHOSIS_HST_1', 'CHOLESTEROL_CLOSEST', 'HDL_CLOSEST', 'LDL_CLOSEST', 'TRIG_CLOSEST', 'WBC_CLOSEST', 'HGB_CLOSEST', 'URIC_ACID_CLOSEST', 'HCO3_CLOSEST', 'SODIUM_CLOSEST', 'CREATININE_CLOSEST', 'EF_CLOSEST', 'FEV1_CLOSEST', 'EOS_CLOSEST', 'NEUTRO_CLOSEST', 'MONO_CLOSEST', 'BASOPHIL_CLOSEST', 'K_CLOSEST', 'EGFR_CLOSEST', 'TSH_CLOSEST', 'T4_CLOSEST', 'GLUCOSE_CLOSEST', 'HBA1C_CLOSEST', 'ESR_CLOSEST', 'VITAMIN_D_CLOSEST', 'MAGNESIUM_CLOSEST', 'FOLICAC_CLOSEST', 'VIT_B12_CLOSEST', 'BNP_CLOSEST', 'PLATELET_CLOSEST', 'PA_PRESSURE_CLOSEST', 'HEMATOCRIT_CLOSEST', 'ALBUMIN_CLOSEST', 'PREALBUMIN_CLOSEST', 'MR_CLOSEST', 'TR_CLOSEST', 'MEANPLATELETVOL_CLOSEST', 'MCH_CLOSEST', 'RDW_CLOSEST', 'MCV_CLOSEST', 'MCHC_CLOSEST', 'RBC_CLOSEST', 'LYMPHOCYTE_CLOSEST', 'CA125_CLOSEST', 'BILIRUBIN_CLOSEST', 'ALT_CLOSEST', 'AST_CLOSEST', 'CA_CLOSEST', 'PHOSPHORUS_CLOSEST', 'URINEPROTEIN_CLOSEST', 'TOTALPREVIOUSHOSPVISITS', 'TOTALPREVIOUSEDVISITS', 'TOTALPREVIOUSPCPVISITS', 'PREVIOUSSPECIALTYVISIT', 'PREVIOUSURGENTCAREVISIT', 'CAV_REC_SEX', 'CAV_REC_LANG', 'CAV_REC_AGE', 'CAV_REC_IPOP', 'CAV_REC_PRIORITY_CODE', 'CAV_REC_DISP_CODE', 'UREA_NITROGEN_MAX_1', 'UREA_NITROGEN_MIN_1', 'CALCIUM_MAX_1', 'CALCIUM_MIN_1', 'IRON_MAX_1', 'IRON_MIN_1', 'GLUCOSE_MAX_1', 'GLUCOSE_MIN_1', 'HGB_MAX_1', 'HGB_MIN_1', 'HEMATOCRIT_MAX_1', 'HEMATOCRIT_MIN_1', 'CHLORIDE_MAX_1', 'CHLORIDE_MIN_1', 'SODIUM_MAX_1', 'SODIUM_MIN_1', 'CREATININE_MAX_1', 'CREATININE_MIN_1', 'CARBON_DIOXIDE_MAX_1', 'CARBON_DIOXIDE_MIN_1', 'RBC_MAX_1', 'RBC_MIN_1', 'MCV_MAX_1', 'MCV_MIN_1', 'MCH_MAX_1', 'MCH_MIN_1', 'MCHC_MAX_1', 'MCHC_MIN_1', 'ANION_GAP_MAX_1', 'ANION_GAP_MIN_1', 'PLATELETS_MAX_1', 'PLATELETS_MIN_1', 'WBC_MAX_1', 'WBC_MIN_1', 'MEAN_PLATELET_VOLUME_MAX_1', 'MEAN_PLATELET_VOLUME_MIN_1', 'EGFR_MAX_1', 'EGFR_MIN_1', 'RDW_MAX_1', 'RDW_MIN_1', 'BASOPHILS_MAX_1', 'BASOPHILS_MIN_1', 'NEUTROPHILS_MAX_1', 'NEUTROPHILS_MIN_1', 'LYMPHOCYTES_MAX_1', 'LYMPHOCYTES_MIN_1', 'MONOCYTES_MAX_1', 'MONOCYTES_MIN_1', 'EOSINOPHILS_MAX_1', 
'EOSINOPHILS_MIN_1', 'MAGNESIUM_MAX_1', 'MAGNESIUM_MIN_1', 'PHOSPHORUS_MAX_1', 'PHOSPHORUS_MIN_1', 'INR_MAX_1', 'INR_MIN_1', 'ALBUMIN_MAX_1', 'ALBUMIN_MIN_1', 'TOTAL_BILIRUBIN_MAX_1', 'TOTAL_BILIRUBIN_MIN_1', 'AST_MAX_1', 'AST_MIN_1', 'ALT_MAX_1', 'ALT_MIN_1', 'ALKALINE_PHOSPHATASE_MAX_1', 'ALKALINE_PHOSPHATASE_MIN_1', 'TOTAL_PROTEIN_MAX_1', 'TOTAL_PROTEIN_MIN_1', 'BUN_CREATININE_RATIO_MAX_1', 'ACTIVATED_PTT_MAX_1', 'BUN_CREATININE_RATIO_MIN_1', 'ACTIVATED_PTT_MIN_1', 'TROPONIN_I_MAX_1', 'TROPONIN_I_MIN_1', 'SPECIFIC_GRAVITY_URINE_MAX_1', 'SPECIFIC_GRAVITY_URINE_MIN_1', 'PROTEIN_URINE_MAX_1', 'PROTEIN_URINE_MIN_1', 'PH_URINE_MAX_1', 'PH_URINE_MIN_1', 'KETONES_URINE_MAX_1', 'KETONES_URINE_MIN_1', 'URINE_NITRITE_MAX_1', 'URINE_NITRITE_MIN_1', 'LEUKOCYTE_ESTERASE_MAX_1', 'LEUKOCYTE_ESTERASE_MIN_1', 'BLOOD_URINE_MAX_1', 'BLOOD_URINE_MIN_1', 'BILIRUBIN_URINE_MAX_1', 'BILIRUBIN_URINE_MIN_1', 'UROBILINOGEN_URINE_MAX_1', 'UROBILINOGEN_URINE_MIN_1', 'WHITE_BLOOD_CELLS_URINE_MAX_1', 'WHITE_BLOOD_CELLS_URINE_MIN_1', 'RED_BLOOD_CELLS_URINE_MAX_1', 'RED_BLOOD_CELLS_URINE_MIN_1', 'CALCULATED_OSMOLALITY_MAX_1', 'CALCULATED_OSMOLALITY_MIN_1', 'DIRECT_BILIRUBIN_MAX_1', 'DIRECT_BILIRUBIN_MIN_1', 'LACTATE_BLOOD_MAX_1', 'LACTATE_BLOOD_MIN_1', 'BACTERIA_MAX_1', 'BACTERIA_MIN_1', 'EPITHELIAL_CELLS_MAX_1', 'EPITHELIAL_CELLS_MIN_1', 'AG_RATIO_MAX_1', 'AG_RATIO_MIN_1', 'PCO2_ARTERIAL_MAX_1', 'PCO2_ARTERIAL_MIN_1', 'ADI_2015']
In [114]:
print(len(gbm_features))
288
In [115]:
# LIME does not accept pandas DataFrames as input, so we convert x_train to a NumPy array (with NaNs replaced by 0 via np.nan_to_num)
gbm_exp_lime = lime.lime_tabular.LimeTabularExplainer(np.nan_to_num(x_train.values), feature_names = x_train.columns.values.tolist(),
                                                     mode = 'classification', discretize_continuous=False)
In [116]:
from lightgbm import LGBMClassifier
gbm_clf = LGBMClassifier(boosting_type = 'gbdt',
                             num_leaves = 201,
                             #max_depth = ,
                             learning_rate = 0.04
                             #n_estimators = 
                             #,subsample_for_bin =
                             ,objective = 'binary'
                             ,metric = 'auc'
                             #,class_weight = 
                             #,min_split_gain =
                             #,min_split_weight =
                             ,min_child_weight = 700
                             #,min_child_samples =
                             ,subsample = 0.65
                             #,subsample_freq =
                             #,colsample_bytree =
                             ,reg_alpha = 73
                             ,reg_lambda = 958
                             ,importance_type = 'split' # 'split' ranks features by how often they are used; 'gain' ranks by total split gain
                             ,num_iterations = 1557
                       )
In [117]:
# LIME cannot handle NaN values
#gbm_exp_i = gbm_exp_lime.explain_instance(np.nan_to_num(x_test.values[100]), gbm_clf.predict_proba, num_features = 288)
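The commented-out call above would explain a single test row with the tabular explainer. To make the mechanism concrete, here is a minimal pure-Python sketch of LIME's core idea, under stated assumptions: the one-feature `black_box` model, the kernel width, and the sample count are all hypothetical choices for illustration, not part of our lab model. LIME perturbs the instance, queries the black-box model, weights the perturbed samples by proximity, and fits a weighted linear surrogate whose slope is the local explanation.

```python
import math
import random

def black_box(x):
    # Hypothetical nonlinear model to be explained locally (a stand-in for the classifier)
    return 1.0 / (1.0 + math.exp(-3.0 * (x - 2.0)))

def lime_1d(f, x0, n_samples=500, kernel_width=0.5, seed=0):
    """LIME's core loop in one dimension: perturb x0, query the model,
    weight samples by proximity, and fit a weighted linear surrogate
    (closed-form weighted least squares)."""
    rng = random.Random(seed)
    xs = [x0 + rng.gauss(0.0, 1.0) for _ in range(n_samples)]
    ys = [f(x) for x in xs]
    # Exponential kernel: perturbations near x0 count more
    ws = [math.exp(-((x - x0) ** 2) / kernel_width ** 2) for x in xs]
    sw = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / sw
    my = sum(w * y for w, y in zip(ws, ys)) / sw
    cov = sum(w * (x - mx) * (y - my) for w, x, y in zip(ws, xs, ys)) / sw
    var = sum(w * (x - mx) ** 2 for w, x in zip(ws, xs)) / sw
    slope = cov / var
    intercept = my - slope * mx
    return slope, intercept

slope, intercept = lime_1d(black_box, x0=2.0)
# slope > 0: near x0 the model's output increases with the feature
```

The surrogate's slope at x0 approximates the model's local sensitivity to the feature, which is what LIME reports as a feature weight for a tabular instance.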

SHAP

“SHAP (SHapley Additive exPlanations) is used to explain the prediction of an instance x by computing the contribution of each feature to the prediction. The SHAP explanation method computes Shapley values from coalitional game theory. The feature values of a data instance act as players in a coalition. Shapley values tell us how to fairly distribute the “payout” (= the prediction) among the features.” (Christoph Molnar, Interpretable Machine Learning)
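The quoted idea can be made concrete with a tiny worked example; the two-feature value function below is hypothetical and only illustrates the game-theoretic machinery. Each feature's Shapley value is its marginal contribution averaged over all coalitions, and by the efficiency property the values sum exactly to the prediction minus the baseline.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values: each player's marginal contribution to every
    coalition, weighted by how often that coalition precedes the player
    in a random ordering."""
    n = len(players)
    phi = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                s = frozenset(coalition)
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (value(s | {p}) - value(s))
        phi[p] = total
    return phi

# Hypothetical two-feature "model": additive effects plus an interaction bonus
effects = {"AGE": 0.2, "BMI": 0.1}

def v(features):
    bonus = 0.3 if {"AGE", "BMI"} <= features else 0.0
    return sum(effects[f] for f in features) + bonus

phi = shapley_values(["AGE", "BMI"], v)
# phi ≈ {'AGE': 0.35, 'BMI': 0.25}: the interaction bonus is split evenly,
# and the values sum to v(full) - v(empty) = 0.6 (the "fair payout")
```

shap.TreeExplainer computes the same quantity efficiently for tree ensembles instead of enumerating coalitions, which is why it scales to our 288 features.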

In [118]:
xgb_shap_explainer = shap.TreeExplainer(gbm_rh)
Setting feature_perturbation = "tree_path_dependent" because no background data was given.
In [119]:
xgb_shap_vals_train = xgb_shap_explainer.shap_values(x_train)
LightGBM binary classifier with TreeExplainer shap values output has changed to a list of ndarray
In [120]:
xgb_shap_vals_test = xgb_shap_explainer.shap_values(x_test)
In [121]:
# Variable importance summaries for SHAP - train data
shap.summary_plot(xgb_shap_vals_train, x_train)
In [122]:
# Variable importance summaries for SHAP - test data
shap.summary_plot(xgb_shap_vals_test, x_test)
In [123]:
xgb_shap_vals_train = np.array(xgb_shap_vals_train)
In [124]:
shap.initjs()
shap.force_plot(xgb_shap_explainer.expected_value[0], xgb_shap_vals_train[0][0,:], x_train.iloc[0,:])
Out[124]:
[Interactive SHAP force plot; JavaScript visualization not rendered in this static export.]
In [125]:
xgb_shap_vals_train.shape
Out[125]:
(2, 51200, 288)
In [126]:
import random
patient_number = 101
# patient_number = random.randint(0, xgb_shap_vals_train.shape[1]-1)
patient_number #101
Out[126]:
101
In [127]:
shap.initjs()
shap.force_plot(xgb_shap_explainer.expected_value[0], xgb_shap_vals_train[0][patient_number,:], x_train.iloc[patient_number,:])
Out[127]:
[Interactive SHAP force plot; JavaScript visualization not rendered in this static export.]

As the force plots show, SCHED_SURG_PROC_CD pushes the predicted risk of LOS > 5 higher for both Patient 0 and Patient 101. Patient 0's predicted risk is low, while Patient 101's is very high (a model output of 1.98). Patient 101's SCHED_SURG_PROC_CD value is almost half of Patient 0's, yet its contribution to risk is larger, suggesting that LOS risk is inversely related to SCHED_SURG_PROC_CD. For Patient 5010, by contrast, SCHED_SURG_PROC_CD in fact sits on the lower-risk side of the explainer.

In [128]:
shap.initjs()
shap.force_plot(xgb_shap_explainer.expected_value[0], xgb_shap_vals_train[0][5010,:], x_train.iloc[5010,:])
Out[128]:
[Interactive SHAP force plot; JavaScript visualization not rendered in this static export.]
In [129]:
shap.force_plot(xgb_shap_explainer.expected_value[1], xgb_shap_vals_train[1][:500,:], x_train.iloc[:500,:])
Out[129]:
[Interactive SHAP force plot; JavaScript visualization not rendered in this static export.]
In [130]:
shap.dependence_plot("SCHED_SURG_PROC_CD", xgb_shap_vals_train[1], x_train, display_features=x_train)
In [131]:
shap.dependence_plot("SCHED_SURG_PROC_CD", xgb_shap_vals_train[0], x_train, display_features=x_train)
In [ ]:
 

Python Lab Update (10 points)

Questions & Answers

If you did not use a gradient boosted decision tree or random forest for the Python Lab to predict length of stay, you will have to create a GBDT or random forest for this assignment. Turn in your code in either .pdf or .html format for this assignment. Answers to the questions can be included in the code.

1. Take your best model and re-tune it using hyperopt or hyperband. Retrain your model with the best parameters. After retraining, answer the following questions:

  • What is your AUC on training and testing? (2 points)

  • Is your model overfit, underfit, or fit well? (2 points)

  • If your model is overfit or underfit, retune your model to improve its performance (manually or with gridsearch). Describe the approach you took to improve your model (1 point).

  • If you already used hyperband or hyperopt, please discuss your tuning with these methods and resubmit the same code. You can additionally try a different hyperparameter tuning method if you’d like.

Solution:

LightGBM is our best model.

We re-tuned this model using hyperband and retrained it with the best parameters.

After retraining and retuning:
AUC on training: 0.967
AUC on testing: 0.839
With a train-to-test AUC gap of about 0.13, the model is heavily overfit.

We also tuned the model using hyperopt.
This gave Train AUC: 0.990758 and Test AUC: 0.830068.
Here the overfitting is even greater than what we got with hyperband.

We tuned the model using GridSearch as well.
This gave Train AUC: 0.7924184238583627 and Test AUC: 0.78052522159409.
The model fits well here (the train-test AUC difference is within 0.03). However, we obtain a higher test AUC when the hyperparameters are chosen manually.

Hence, we choose the manually tuned model, which also fits well.
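The overfitting diagnosis above rests on comparing AUC on training versus held-out data. As a sketch of what that number measures, here is a minimal rank-based AUC implementation; the labels and scores in the example are made up for illustration. AUC is the probability that a randomly chosen positive case is scored above a randomly chosen negative one, so a large gap between the train and test values means the ranking learned on training data does not carry over to unseen patients.

```python
def auc_from_scores(labels, scores):
    """Rank-based AUC (the Mann-Whitney statistic): the probability that a
    random positive example outranks a random negative one; ties count 1/2."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Made-up scores: a model that ranks most positives above most negatives
labels = [0, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8]
print(auc_from_scores(labels, scores))  # 0.75
```

sklearn.metrics.roc_auc_score computes the same quantity; the point of the sketch is that AUC depends only on the ranking of scores, not on their calibrated magnitudes.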

2. Take your best model and find your variable importance using LIME or SHAP. Plot your variable importance summaries for LIME or SHAP on both your training and testing data (2 points). Then answer the following questions:

  • What are your most important variables? (1 point)

  • How do they affect the model as a whole? Or How do they affect a certain observation? (1 point)

Solution:

LightGBM is our best model.

The variable importance for LightGBM was computed above using SHAP.

Variable importance summaries for SHAP have also been plotted above (on both the training and testing data).

The most important variables are:

  • SCHED_SURG_PROC_CD
  • PREVIOUSSPECIALTYVISIT
  • SCHED_SURG_AREA

  • As the force plots above show, SCHED_SURG_PROC_CD pushes the predicted risk of LOS > 5 higher for both Patient 0 and Patient 101. Patient 0's predicted risk is low, while Patient 101's is very high (a model output of 1.98). Patient 101's SCHED_SURG_PROC_CD value is almost half of Patient 0's, yet its contribution to risk is larger, suggesting that LOS risk is inversely related to SCHED_SURG_PROC_CD. For Patient 5010, by contrast, SCHED_SURG_PROC_CD in fact sits on the lower-risk side of the explainer.
